[Binary tar archive — compressed member contents not recoverable as text. Recoverable member listing from the tar headers:]

var/home/core/zuul-output/                      (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/                 (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log, mode 0644, owner core:core)
xq^б8| {qV|Ӱ4ֻ &k2LvsO3صx.9uCt,gJo6^|URB sз d7|Εz g2ޞ۷ f}ߝI7~k$|Vv2^STw!k?,uRne Lz|-W].+h|қ f/6ƭ4(_FSsDXjHo#ĒywV:zQˬkn9WBӀW 1Ibv !VKF!ç~6?s5L6$q13Ox!`e9@3^A$K|>3c—.h^ť[tqq9vИ2LW^?Âϗ H,&'߆F |IYyo}X?ǹߵC!w].:;lTkKYͣ]9TW`UݞٝSuʮ6ͱ / /VLWf.kUy$f5<97[嗲W7x8K6oyws<b`/i_GF=ohw/:2<(gw,=%!?F3156z> Ͼ)Gt_B?i!S}\q\Fit v ccfu:L04N榐0gA3}QXqPt,3˽,";0ෲGo!wJzș*"վ]RtWF1?_J;9R0w`N,}hf{h@4n~W2@z)Jj'Z./#UAVR_4yJcw6 P3 bmv-/ZcQSnZf"\؊z)zeL3Yp$:I /XV,I"RMx[^ .3i&Z`TN hO\*HL[ϋܳ\  "f1Lu|]nZ++42^ŊndsYmRuHir03WHJxܼE4t6W [{P(aA&(_`3s;3y_HgaE+!+yg90"B!aX`*x C(uT+^8`]2 L%$Ap]xyUBN4@0!͌<jLGI"MM553BF;˜aj{ $DI& xf"+ U XoZ/[%4~:'?o˿72.̞shz ãtuEX9L͎-" xbK#ބa4H}s0mvevH ExNԇ,/ƒ]Lb J9w TKp8HevF*}= ^f;B̧< i7.,gsWhn~}Vzvn[~|k4H*G2m%h'ߚNb|5덥vbU;9{AWm\q SHuJUFwGf_to O'Ϛl oJtˢpYmH?OgI9o^((Vdʒ^^ժbbZYLX^@Cٯ|Tg~:kuݫ9W*ݥ*uL1k楎0fc `>&ǛaF_ࡼސlC ěaݲe%^g|?_=}|ū\¬'hq*V *~Nh(oVtj>k=m5{r_.Gn6[$ɗe;gKYqv`RQ> a~U鎢l _QYp%UcX6WY w992Pu9ҩ_b;C<蓻+PWP\Pdy3ά߯LO/8.KX޵#"aZa :`3 v"5<߯p˖lHͨ3FȏdcYd(w\+Oe7)5B7b:B8 oCu^sRM\m?̿?`Re. TKIf7XzY JT@` :Ƭs9:0|=ʫ $N3^pnBdj&-wUwadg20B6\S9"vowy5E~O:7ԭ١P*wN.yy*`6gQ6_u~BloCM|Kr֎XEvlҿ VwJ1RvA - #eI!s/4Ԣjc:ϩ9 $` ܷws}+w}$u2~[F^ 6uӎIϚQQ V;'Dڵ)+&7pϥ &$%39 VVFc:w}ܡ"Dr)sfgL9KҔsEFU |L)M9yJՠ9{CN->/nkP+UK 2L{yb܎<@terBeˠ i~?Cka `g.ÛAVzl(e'ۈ/lO{àOzxc2&goYȊKa+[}S*mNygUb>cBĔyp1ER )d4[Io+m!kI4R ,j)dL8REXD S"h,KM4 eV6ݢi[E+ݢV$*FCW#+<"B6JBc+DK^(9i $"dzpnn<1ҕLr]`CYB*!@ͯGߞ-]!]i-{!tvEWX *jNWkWHWFc2*B,cъ/#J)[:Bvc+t,th9JjW] va>>B=t=#C=rdiK]ٖj##+~iS=RBWVӦjKWCWhiuDtc+{Kդt( m[B0m8]Zy؞Plj/t%, jتh C[-k<]!J1ҕ阌AY&_\/y~LkRyG1ydV%=\D)$w[Y~\Lj &+=1L[w'$o9Dx,mDR# Q^&ҩ7ncD^'JXU+ X5fNf<"SAL<5e׀(Vj=[h\ VFCWn DEKWGHWZNbҮo WX *tB[:(&`͢+kX,tUM+@iIk #])1PtD!w+D{CM5Q6PSKW+1ؘh hnt(iJ.0 o 2td<3Cxv↽rrOQ(E=>}vy~ xU.62`oZJlQ!2;. 5;yCtvN>*:Edioe!\7M,+gEwt!d'P+{5  ʇq. /a_xx/Dnqx=}p.;u*8tr^޻{ '=_x`Ӝosf^cNs4џe*LZqΙi)j~k뒔{y[ƒ9ݟ ݟܛm>('ڭ4Y̋ gps`_Α+n>Q{e@ݳJ2},TŦwU_KWm77E6LnD$ǗIuٚqy'5pQ{! 
`L9"KF/SybgaUo4|,|Xi*uۼ7a=~}{*dYq~Jt CO*Ag%o @i>8(> ݏX7p]+qR"qVꝳg8|ūEI<^nO7j`]< Mѿ ت$^_rɢ:Y VAZz^z4݄ߪI}4ԫPg50}~+Hgdկ"]>WDevPkޝ!ΫONQgCGSH/c֝[B`i4]oUeYb<.'2%=ZS*'(cF]K68Y$s :|6 w70C ˚]ZylFˡ}LIxKgMՍvתR|܌@ <4y$"8DQ4| VVFc<]{jAwt@l@5[W?sfި0RʥmQY/3 ,ISνUg&'3\p4T)ulYRɢ4'y/tȉ=aZ5gR c.'))Z 2V:0; 8qTyPf*s]ei2[m`YZ-')4apZI qT;~X@07rcAO|Y)ric\YCDY,7"gHv2f]@?<:xg#R˩#[cRfa`VReURrޝ ޙ=]g) b_M ђG %-jo[@e79ޞ #}BR%xAsb҉gj4,ᅙL}~p(| "gRm vxUu-SgldZWV(JX򜒠Ir\)K,e\E YlةKTqi@ԅ@# 'xjl&RgYv2ċ;󸚻,v$m7HzYSk՚u{ǗWʚY/>k|YnbZZ-U-ؓ+N`Q2Āur"j{&I&:OrG$PxEAǠrfH4rb4y vϘVi-y mεc̬єj@KYqCR@p?A:H}4~+;`DRr"Q]h><'PS!͔ɼǫUکi_SFK[^kun%{%&*ߕDS _΃v^OZfU6\&S0 $50`]qVvM_r)מJޫ ctJ0ډbxK?F RAWUMY0j50ŭ*5u;KuL2#Y.6}Sr༪c4a } ~mmBtF:tE d]I-I%]+/ƊT DEQЂ؊8yCo]&TTF.u^u&L)K,XX[_yWOiv&=LXܚLzN^8o.^%~y~Woxeo~zߠ&!Cougɷv?7/ڰ] ͋FlReg AvU^SeҠe@^|_n$~9/Z Vq.O6 q޸Ͳ(Qxf*ɮ|I!t)օK"8ǮVTxFsEH+^ ,`p)WpPiI`yQ*vb ,'TkAJȖTO'Y\Pr]ձՆnȕzmꮖ!zn'-EXM>7&ǚ>ni(%LWi+g_~}D-Ez?JOUOUsc$}D`R X)؎֡Pg\|Z/܂J n-l໙3Ӕk%[MA"LBX QCDJ*:HgDT  ?30Nz2/sv ~$s6jif+4Ipv@pNu㔰sq',K 9ʊPlt0onoG"탔wpi9/r^弨.7 (uɲg0|:Q)f;@d8yPF\{"n<+i3{qlyV^d) EEUo4!1K`.^}{N;~{}ۇ 8+GaɇmHӒ6EqW0P{{mu#V 0nY5ަ#5(Vl1ɉnu F߂b5JVobB OP7ɰ*雂MGlhىߍݤ~G7lC9B=hm} #/s[BPzVoŀF|>m5R€|Ka(-Vk.`h8&06ָXInj-2soGo%N]|4hb6oybaɟXj+IŊZh3NLE#MO%q7MS:}VȍVTroLtjް> ya:G9߳gh|[7izxƧ K+onL%cq׫t@Wix@HF|kΆ?{n@-i3_\*aSOahU <xwxF$,4B.Ŀ B.Ŀ q{IU@WW`UW`r E ܕTW+ ӕJt%a0:# P/! TYd΂V> LJ}̒1l$eʤ ơӲݔKKuEEZJ-ꕫd<3:n3󹹝\a3oG|c5}P|S7fYf0S ?Vr,kR59\{)4X se0QΣ\BT9o lJQY:]XݤHP<)yL""~ZohV@Q)~kk0bMGm$v, yQB[0ZriY^nYkVwB׎">q_]}YTkBWOv9Y[?yqUqr..e5jT%ZYPme\ֹaIkͭᮀ<Z+ᮀnN,y6'B4Y/S 5w]jUpvz_tpY .ʗ8s/?JqTF)֑aYqX+\$JK$@ILBo_Vn9>Y%mN` vn{tcV 5*g\8ʶ2J>GQ.r#`| w9zN{k{s.-:P]mf) ;X`f[kgr}Q  pc& gX8r$-r@:Rb&y$LViǬQFYAj+h^jF5ʡ1rjVI0?_Lp(T<}Q}r5h=-u2ӧG7]^Y-yn9Ʌ;C""#)i>zvG{eD#<:u w1@{hfnmy@4:&E"A!Jb,#FH HQ<ʏņ{Ssq4}3k0)/Q)P 8IS(ψ ?N4%0R\\IO=JIIёUR#BB::Fj>me W%X]iBr~?W,.Hb2z4I>6 "r&ia$ɾS5cz)1ƸyN\fO Sy'OWCvOLqNe^#S^JA)ԌN8OC_g@0YiC(XIVYYU 4EEa0Ic,>g燡g&? Kpݘ )&7KYrʀ3`JMTI[UC$>(|Yq"A^^~Ȗ4};+_OfFoa|]j*8[ATΉ6_z]^sKbn+"jo'ŵ+3On? 
^1$ڞªuCwn=0aqF YL|8&zwݛlBxmmu6uD !|Et]f<:}>ί%>ٰxpAX n`;_{ww߽D]߽ۛIy.V$j t_o~ߵ"Mu ͺZ69W7:5~bPh{n.[ C7vC&S/nҥBjVK+Wl`xoiUO$]Q!مߥ۸5W#Q16HJ~%8cJS4 <_&2K?+à wHsIM]k߀ϭ9"fZfd*[+ϽQ]M×j~ޠ [J[M\ɼ(}N~ n8iDpR|uRSW^|go#LE>1\Gw[+I4vEOCЛ> PvR;ܙ3vTJI_ joK RB.3w{1t:ԇ>($r p `~C-SO;RݫRSmTyw1;d&#QHYu2LwZ N豉hj4BZ"Lu;_vUP I'ϡYײvz4n;f&,_vGEwo.0 ,- ȫYH&Ҍ56~WV\)NZ[JaZmrlKK)j_ش>@FdM(֢+T:Љ Vp).}Ԗx93Y;IWMZ(mz^{t -*OD|XhSbܤ٣QNE {䕢Mgm<Լ߿Yݖ|fϋܵZ꾢U Kk-~`4ExU)7t1K1vBO9]X[:L]H܆ ѐӗ;J8ҋiJA}V} m1X$.N$뗒b㾶Z>Ns4fQoo܀8X%)owKazv}l,tq4mꡐfqܓOTI}@VfCzũ,nyt2r7!/tOyV셮HGm-aY}6ޢ!}k1E4 B"ɵ^TiFb9儌MNU&p%T̫dQ $/c|כܧ#t-Bs u]k0黰2L.\TfHmX 8M,΋@K,BwD^fWAz ̨%ԝX"sk;HfI+'p˥P-po*H۸/ם\At6^`F]ScR[X,B>ߥQJ+hzt{OI8kvH/9pɕBAVwᠤPͯQN4C.uo'bM{]V3RT:T)te/(Lq!|(Dy`pÊWT_ "/VSkb c%& C_|׎Օ[.UQXY5% ̈́4bTnLE8CrƉ%# h)th4Q0!S&pZ e|k^jF5\Ԭm)Я_>uH. .Tә?~{tuxZbbYZ4`|M/C(s K5wRAF[ʍiG8r/|%($y '܊B{d#)xג9 wZI%HD0 AP$@UbpcP3_ x4:0u: 53>HOe043"u w*?y4i#HrIFVp%=zD8#Q#BB::AjVr_=oC3[9UP?O7@ BodϿ*E\<@`{?EH+TM#ED´7(4IqViw^J03."`&yr?+L\y&4$8v:գ_>R90koiN{멘~,|y?}윭T.sk>qп.hK#mK"`I]Ⴘ0|Iʞ_mU7*)3GD\G% sBO&애ͽ'Yk\%b4uj9?GcTwި(DAȰhTO\4{w x~~_~z/1Q/(0//:${ݷVyT:]&g=#x~uSÚ>j¶D鯽o?9_'.|x7YTj*'+W` 9ZNuܬ|T\$%Un}Bs_Wfe#-{U{T[ْ;b1V'o4)y}3vmkd *0dC{Nj`L#>?7st[LzC{˝gTYW 4)P#up{\?gq&Q^˰xkb47zw܄"LSG$fx.68CB|=e k#f q(6لݞ[iv 4KT3,~].\_ߓXag+jVzV#]t^Yteҩj:UM)G\6֕8j8'NR[^GF\D%+DgJ*27+:7( Ó @f\)nuvoUI<Y,f wf9hH8(F :zn|?a|6t4ykfL[&[/_\3PiJzNJ,u_& IU|R1qV'X2#,RaJ+K S#-\WS>Pm FɃ߯ð7aYrt->*LZbm~ G+Xn%ǽ\jZgA!փj4IS8?I~{ u}}.~%gby|!-.Xn V@+=-|S_ rK'gǽ/1/ 3=I{?7SLܐ&Mq'w{3x}[n\ր|lr? 
WkvG;F%y% 0K/AAvji8 gΛ³{7r>Hn˂I<['Ww  ,_̝xh`,niy]+>ʿ67ٓr;`PB#wPAwzEݽjOh̟{b?Sn { &(k .ِff89 Z`hf)1H-h;@ %w9>ڦ&SΣ{=w2c4R/z"_)-a/j؋G Ģ+/c 8gsѻ(!RHɽD:F0G EέRȰ, Ƹ#aREbD hl1jl8U-tH}T9&w[|y޹ 杠#ڗs0y%G+Bhw$Zt]|9k)@jkLPJJܳ@oH:=H 'g-|>_yĢ~ڑ-^5_QiuGNKF!D!e!x0k5f,`S8UFL&Z i9\FK%}Ҳu-Fӣ^`|V~9Ny,$ gR"J3v >!?^qyV+;*l)rٳ--}x!/5d>ZұatL@SNq飶ěUԤl<A S1tՔň܊{t66[4TIѳNTE#KG}4 j-$* %))R:" ;=Yȅ4*`L1W A|@N@`kC#ȁg1MHؓK'ctXHjb)$H[XA`IrY$536!cAN:I$MI;j kA(G]ַ@hnsU<_vEy;Os[/< ,~Sfw/mjd34vY/m&ufj4=n3ơMF&wfw&9CNި \x;9-.Ro7S3oy~Ǘ7>eF˥77x4x\km-- /boҬ7۾`ֽe$x.nh<|#泛.;*bDq6pqJ;!@k%)llƣ>~dy=TO rH# be}0z)#"b1h#2&"$#{Mߍf?r>l]GeOwJKt>JYE8U6hQ9b0XJBԀDYcUώk;^Me;:P*Ihkwnɲbvgkb &uDf tvYPDcc'UYa: PP0vY4a#A29HysQ b=j,4ScNB΢m0\]V况&-79(An|M =^ &cu76QKʤ# $g\Kr1PYf@~\ENa3ifb%?ȹ̥bԄK%(\rmP&qxVDŻnw]ךּ=H0Fz-UzDf Eq$18'VߚvgpV<9mY_T;. QhYmgVqmԽj.qW V1ݓb%msO(߅c/l4Ҳbĭp, >1x|mO]>&.M|3w`~ݭz`L.}y@ܵr9=QGQҖJCuU& Ĩ~IF;z~:{QVz_ſη7f>[}Pڲ>zT!v[XP3 ji)(xYoׂ|tr4A O9.^׷(u=eg)\NL} ɝ9vFгmۅ~eTd#9;B+T-~^M`Ƙ( ~9f8otګ6Ki,6Kim-hڰVj۰Vj۰6U6Qҽ8& r$%9AK}sr g'Y&h"mH*Ҧ"mH*Ҧ1ƤMh+Ċ6Mh+Ċ6MhMJ9G+8Nᗟ_ӱ?xKk9zHs<Zhr<(Jt[ϝL, =0(3K_*=r0:"sJ"90 .R/% ^3޸uu6.xM<@$,ƩFZvayLtaÍP7.92^bV>^:Q)fx$Zx-!yPF܆ִI:hsyL,1&qra_AQmvHW)e{J#,^ ;G h94w=?EҥZ47g=m6L?҃Ɠ|MnΟk`TBn.zTrxStMha{. ftͳ,Yydc2a"k/%@J&]DK9 Q 5M%y,- S<=]dpG+H X>L"zi>`ɢzk3];CRl %eRh ]왰Z1Q;Zx˳ו^M$JW(ڴܱ?Ȇm<.C|jэZ.ܜ tv|{$~8K Gʏ] vtelr $ͤX>:sBA[)!N/ '+wz;1 K4Qb$nD8OX hiVU &Vz5޹R'OR@a{=mqҀo;wzٖ]O;hа ~ؿl;G\`ڷ|kAP#Iꊱ)}o[8}4.?=4i,]q0κީLp1nT&CHPjH>3!L*W1ρ$VnީqgvO<}\X_oAO ~ЧK>2=! |2r-Dl֚O:#5VW֕^103Aܡְ(2YRZ8!Ghw3 <υ=\[)>3a;MQ  pc&(r2E^K(GRh&$\.%6!`Bqhvedk^jF5ܦ1p,s6gӤi_7lt DkQ99rI$Wt+OMjOV]J̈d*g)I.,JE@zE"0F*Rl)7Z %lqo (XAsS{`NDV(#)TDMf(͸Jb,"BD4,{,={zcE~yވXA଄1R@p.853>Fx$A<#"X'ރjܢ^xshڕhZHJzQj5Z%3Ҥ  4)%f 3 ×RXV>%N3\ކz(?c1C3{B$\+K0E? 
'~+kܧR"qf.;zrIOཝOC13&S05fX1P(^#S_HA) v#`Wǡ_%P0 0+:mqkN!bp5h q;EW`,>Nvg&"ESˮMHƻ;k=zciNP[TKحLYhiBqHAd%[ZRT~f4~;+_Omg|xft9lZ9[TN2_:\\{KN9znFEKӛln7^0fv^4U4~2i4ч^Qi$wF\uFKVf1$daj 9_YJ<8i ]2+gs} ~p՛燗xۛ?crvK:u H1nSS+0_55UleoF\3HAqZw7_^Х{ԻNJ;NRNRW_̯ k9Wt :$S](!_4iHLKmYmquN3&fXW (nyY2<@iR;ɏ^m9"fZȤ7T .yFŊq彠 TLp$`@x\k;R  J,y4(uDbgeB1gU _OzHC#KF.<}kc#ČcڙjXvTedG Mƥ|@i#.3-3TgO[Uahiy*?w{:EX<y, 4rH8+j{.5[&AjآZzJXHl~t==MLrH.PHx3juR cĜEXp?a*3e`* s2M\ކORg$"oA&P Y6mwE}їg{S۫i싼 ;HݲU>^&bC`/_fzrF9 `hn)1LJ6?7 xDkOoխi2@,DmN{ehd-^J&l)bCmY8?l@§F5Xښ0U_u}!7۵aiLWQ(Fb@)b hccC̕9 电񎵷ovA00jO&53Йb͝EmkɾD2=_^@Yb}) } .JOޘᇛ[сޅn3o }O F*v}d˛[y[7,@di,]-%:+ 8S {4~SA}@`K=xBǯΚT;x&OI  0v:ޗ~Q8r_ R5AuPӬ8wF7ްd0AvmV2+2xw>"hҴ8l7 ɐ属MH±ښ;fbs=Jn.0Y/px sZD)3ln%9 ,ӭf' ە vq FP2 b!Y\z%u.JFr/QvJqTQ( Ȣ`;X+\$JH!MqڎBɠO3jtIϾq6}vw6[% sml-=YjөM>/JpnBc,υ!g$\9reT*`~QjZp G[pUOWe܇A1$5=*gx >VNJQ>_/h[,^\j)8]' F!D!e!x0k5f,`폌FMFSTmxW-}umi< IMR,_vGEwoL_]`PZ &roD5ǵb찻HF"&[k]nɎ j?[1RzX!HbB6J?5FkB}RB'&@[)Q[M*Ffd j0DFh[-uI]F7URP>,W'Hno~Bnm3<*@G _Id~? ]dw/Xo C3E*$e_ 'H 8kCJ&{hjoLvĒy,cZadPg($N$&F:łKsg H\[u-3eN˜6$4 P}v*uP.o O Rz\rd@&xS&,Q5s~~Ѡd?CWd=/@L9jtݜćw2_MrS!TyS ]L8;7ڄezw@|O6{ZnjVuzx{U7y0Nw>ݝg·hwس|Oބ(˭پ~w9gqtNL1I'}ƱZкF*Xp,Z\K@K+'*_ɯo!@S&@j ҏn3;>i6[O:,f#WBVVZ>| pҫɐ25 ]4%*br}IlU]֡c^Z1b3Cl} ѐ LveN: tnOL ->?j=I/F`Oz C4"zݟ1Zlrҋ2j'x҆1($u/qԙ?4I/ʮגn}'t%ȇ[{":4}h:IT*ӧ|UBc*a ~.,zSoӭi Spτ>C$=>%>}DD[MX́dgh瀉$}(I1;`NÉݣDΗWEtybvru?D 볟kx1{ȾײىxGE]&\eMԍ=$6 Ĩ)udAJK]@c̉rb/!VNXYʛ__ʣ@%C"X`T9F$㑅h`"RSFDD b FрG!eLDdH+RzB;_7f\h؟duEq4ZbLB//}ӘG"hlepk2:ڀmT 5ѡ^o1'S ؑfd7מf;tCXl4Y[ZFf5{P9--AƒEiYnST tM={=NY>[\n6* 5FPR(F)a,b.:z%T!i$EtOf9&r6ȹQjEw9lVJHH!^y:Aڂg`^8lf#NK,vlFGaWM ;FցiV1dmon ~fz$T.9 ! 
r C5X Ke(TΣRYi) /ah5zh7߯,]mb@hqaQ|m<?bB㫮qN|ڐVP80$5AZX,`F|0a-- T)P-VJzbyꟹTST>Z7G Vp).}ԠU`̙ڗr8d:ԍZ&),u3mkImd|, @fF)lp +>N;Y:w|v}*g'|NCL%6mvOTxKu³ﴀ [o#;]usu$wNNAfWޘ:A`Ťv_HyuuY}rw6-M{v]`{^k{?[Ow[X!{ l:qsza@['[Ⱦپ~t&nSjm'.+, c0Ԅpsg =gj˙r-gj˙r-gjJّ2*(v1'U^˕vfR䩿P6)&S1AKmc;X&~_I8*]<_RFE&0| bdki /JMqRp\)"b=GӁF (dLtjDWiL4B($3Hh֊Konz;s y]|[NMUSz]@`5 kͨ)~,Wam:,, ̳g %0eXzM\V0#*)dHJm|,tDc2[ eRxZu#-}z ru z2(P0vY4a#Y@SJx->ǃz41-ޯ> Dn 2{4Z&[P6;bWxOyf&O0dPBq@J w^۹ua24Y_{i٢AvSKSW?P'A[)݇m#GA>ҚWط\0s\nƱo^rζ$QK-9+R 3>4눷v60n/vnlxTRu_ܝ0k,M7XH>ݬchp`%J7bi1g;Ja8ߤ>3&wλU87Lmb-h?[bR2f/5>e[oje3gn̘1ǘu:?2cexfX2Ow-}DZ}zБ7llc>ɞN>d\.A=K$6LY=.fᲺE|Sz.x _ߓnK; ؙ4Mi8mńT‒'Π?ySH̼&Mat<`]IF_E_ 34.wŴ>ܓc)tG_FqF7W.}M?}so]`.ŻチS7@EpA$y~~&po~C+0^;4U쐡eoqu1[w%*vgk@/~hg6I$x@W_+0m`:h^MYisVw+C<tم5B#cԑUJd%8UNc,JSuayl/P1, *0@!=Rr'5]~fW?L7O_R i Io\ro*{ASzp$@x\V:bD{ ,ڙiuDbgAeB11U V!P|=e k#N% 8<;zqBuK=dv[5'FЁf5#e*XW Ϲw[mtڂsqእpVa 6>d$@$8/Y,j %$lpaZ$%j`flu&YYEr2yHy}Df zimW-Ē9 @Kd.N+ZԿN#]CiEhiMBZ)CUYN$d 4)qjs5Jb6$5Yõfr=V Y`\au*z&R_XW&kW(@u4te S_|AYH@rxQr?%+cSpOUQ*gBl}`1( 5|&SQp\r &g?ZG8]Vs7nm10e@k-<~8uV&ۭjͥ&bPTF Yds4FkPs>ʣJ,JJ򶢼N.OSCj,9]m va^A+͘7.sp ZvE$pL!Wk jpLҘ.1\QQW4fI8kM0ѷŲܰ^UyoQ3]Qg18vC?L  :pbӋ \Y+Vg{%+q,+g-T=ړ2ӿŐ,xO2OX-crp Bv/Jϑld緽8mop.;=]ZyW 1^!'"LJ@8~ ʵW h ׳\=BG=5` F^*Zy- L\*Trݙh$ڪ*9WZUK7WH%J1~|stUW jśBJ3"sU*䂾sU=فURu\q&_Wgey> Sw?_jlʊxp[[ythGmOi_>%LAH6%1DiH!3)[N[-+?+8i%%9 xL- TT0 ǜ)yə3KϨ+;Al>jDS`WpSYڻ0Ϭ1&dƿ\+v4ٝn.)N->x{{YպyrLwG.G~8R `yΓѵYNj ʡ;v݋Pm{(<|Lb @[C7.͏Ϣ%UyB)/vj/7Y>߯/ךІ7.cW-߫n: af nNbT_w}ڎh2'/_ ٪5]SFd|*S0 F꘭ѥhrSTL ǘQ [ѵ8tڵv-(*6}*aL&'a:436`uПL;0),EN\bAW5%m56s !ei9VwaY*7T\LY "iThPlMd띬j΂PvJ!g"6TuM'SY!|G]=MfONvLݕvnVH#A3?=y'K#x42ϼ﵍;&~Sbh>tnCi cN<[s^?<3yRtk*&sj~$--?V+s~塵[Z;J0Kf|L.2Vq<9#3@sږ/[#gO|] ѷ| 4#ĕkMǮԧvMZ!4YBĮ\|6 jU-wV%\('  ZTkY[#goY@|kW_' x M<+73Wٯ!0JUϠf㶇Uxj׼qx8Z,Ç[~yI0KӛNo.Ro,Tt4XA٨UIp%G-L{&! 
xv\mMY;YWU-:A87-mR-Xc3EX}-5yy wA3;|YwڠWϬXRX)CNՅA]XvRW-mMZX\MX9-w%H]/^/:e {'D3%֙9Y?C] 6MɷhyhWmD 8u1bF],;FXg_}v -5{nob9a cR(s, 4p)xk& dbjS={?nVҷf'|0fE>H]=%!K#aoWX)zF&WVg_p/2o!b:(.a-2fmG襅BX͋m΄$c΀&qHDJ q DI4%` w2Y{'k&KiREY-@hbq!F £eT@ܢET@O $-H+3/eBl 9SI[LRGHe*!2.tAX"m J6S2:2pg2^_ķ[F&+@/o 2R䧹Pu%[9GF#CΗࣃg0![s+@nW2{p$ T)\2Ao;c_EC7wD(FHuh%:Q?:g^[ ؛P2z W`ѤI*칐TQV+.J! Z3F;GTL\ZKڲ-2Jn=k.b ¾A\(KkeS~>7bKWri󄧆ײ2{w1>}ʋҨF%(+ƨɂytˬ#-&iE]zdCH7_u!YI"oP1Rp]EP WghfLF$чFw5@#R٩Ak(4L* >MYuVߕrLPpl%RقREPUAcDh܉Y}TK>\[4:H1I))n+NnLJr$5iH*RӡÓÕ\M'7Fn7 .E\a>Zy?8 pBoKD6M{U8%.10}TaB=ee~=eԿXu Z'zkdMJa xqFq7V+!7n2I~f2E#pEg.PK*|ہj6k;2~.RB>cw6r1Ş *R>}=ʐ\B0m/+ |ncKA(|M,:IrhwP?Vݛ /aF _ 4?28@B/yp=yg?p޵q$B% #b8pgm ,WE*$e[>8Tυ")IQË1`4{R׎JQI֍Z3Wb2(fGϯK]ߌ@~N#tNu^_vϏgN~y.<}{ׇ?ߟ}wz:;[:? Ku$Hhx'Z񪡩b -ukr͸N"mMtk@~|\r{RJ?NkUM++@6UQ;Tr~J)Q|.B`̀@_]::ic-lcvN0N+0fc ~Vw0cd~V Y)ETg1l10#P)rU+ƕ*!#'H:֨۵wYxy3 HYumLL%bU _OzH(o9K[ilO8FКn{#Iu(&'QVsNW 0`ox+)a4cy*>ʊjؙ1P\gR7c`bf[E‰uXT2wXO>iW3Sz %LLr`$ B($<:A1gz N5X46D3gbveNA1XߊG)%z y6PR[6}is B=A=)KVj9KyV$an0 /R!}Uͫ"r>5y7DX.|)\ "!u\lt3 #~yl u$/r@Y\ Kr=5Cz] W(|" ԗ<[ġeY4X߶m_"Ogw"ɆRuor):]>=Ջv  B:'}U9nNցIef8!z8M%Wv,r߶A3֪C3KFD5Hc]e!CvG ă{=w2c4R/z"R" [yؐF[^grV;A&uz < ֊ j&ZpNY˼0oil>YlrHM&+lZ3(v&|ۺc훫<>.ȥ\)=8Q2cBo2(Dkn--%"H \"-r@A$Ҳ9P+Zj" jQ1;`tj+YXj6ͪ <UX-1=`V1c FMFS0>l5 Rh<Ҹ&n(wm ]߮^YUo<_Ŵyuy jvEץ^8vKKI eͲk:jtd_[^*3ZgE̯Wv57_.[{=5^yyFS-`_cσE^8݉B{ NfdkVӯ{l]\ن jL+*ږ!Ӗ[k6^w]w@>/%QR7qH ̚X{!Gr&nmv?T+ 12D !0ӽ^{E4YF=94"[Tnv'X`T9F$1>HԔ1тFQ4`S}eLDa۽1rִָ݇z8vLR`>vDܟv>%H*4hfa_mp.-:tYbC{ ژ%hı>*8g&]+ZfOgN-)7&P(!3TFGR& yCang=>N{?ydYm”+ZrOb-kvOlo@؂eR1ޙ6>f)D+pgkT>;i'Ѳyeۖ޶vmU9(Om֞VjJ7" $cLK218vtpqAı)摤v4/JEH1U )n ``p]enƄRn&x9:2,"1rbrX+/tYm̂f$3dOCv%kKq m$-m]Gk"O@MAԥI2˷Xc&MeJy@t x&!h$v+Iz$.ulv g83J?*dA ;@"/fk7P7 E"N@M˦&u,#UA/ꤦ,'m\/l~4{2̔,k囃䛭~HAs\0+/#ab|k]MZbuֺzRRr$nr|3pf˘-c1,Ckxό2Ŏ1ĽJ>CQ.2#`| K2l/F}qC^sݵā[Q;d +ުה0*4҈QSa1Q9 %# h)tRrV˘)pZFe)AO^jF5 #g>?ۨJlj@SJúb- .bIhuוhn@N\\X*P*((1RbK!-}(6T ((V@|j݌r͡Ijk9#R"%10d[IafRU5ƝVc  0B'TVrAP~7lE{cVof]w% Rv!>RAR4:qG}ת]u)X. 
Zs$NH*&e!J 4#tt,#mlθ9 k;:z3_S$0,_ |j!|k.G<;y0$K525J06.:`F^wTOrS9ySEܙ3&kҒ#ȁ9FGRP;2QۣQ|V ΊN[lBTa1a4!‚tY,'1'q`|釯2vM.HU}S6ULь2 ޑ[ YhjB'So@W\ ó;(&?wfT͓~uOތ/NbrB̕^X[vQ[f?0B Cn$jGb|HWuÐa4fz#2 V0b~Zݘݫxµ|juU":J 40`R7#_<w=9bAt*`O/w\~<={ww۳>3L'g߂V`\QG$&Ayiijh*вY Ƹ)׌a|$RnvD DΧooȥ83?INwZn]I\}W_ͯLQǕ]T:O$S]\(:tS[$m6ҾjcA;M°:yȏ/Y؏YYT`dC{NjS~hF럁:gPLkLzC{˝gTYW b`P#ZnޑJ>g9?Dǁ< .)6 )^X|XIt*Y!T#P؞( x55o*hExB xmܤS [% #E%Z)4A/T7}rSQV:;⼁@,S,`36,NcʠڜϣlzfSHi`k#9oB!gh OLj9+Xpѷ%gYrv=9u&?DVB)YdwOݑ|(ERRQ"[u*FF㋬x^̶vpm#7ԜGxD.XB^z0Y4 ^BY77~m휧ҡUǑ1b6qD?t&^ME5w@z|q+fOڅU"4uCt}e='p3i,eQ)gfő#9= е f5oQnI;ݫǏvGO8ipT8bq50;6J(-&]LV]))4h-(EF~~D'H i#3KmDA`Wry\˽m~5ܶ7c7Of099΋~_p.a|LGj{˽SwZ~Ej@!_s1V ؂PDeR>JZ1Pɢ.&X UG,|Ԯ' 1.iDdy,/.,׬Ͼ|2grA&o|m6CĨnv8JLڷw(U/]MD< xv`?U(P8S@eE%זKg6$PChأQFt3(|uv@צH1X\q@pJg5 Z:s&[MFPƜQZ|#xI,ޝKZP O=-QiǴoY32W`k\5r>sը%Jz -c Jv|U#xe8sըpQhޡ*EgdgclF-S7WJ{4Wmlgd׀+CF. jԺGW͎ڣz7抌߬k\ r mߧ ɹ׿_WWxumcwi$}s__?EJHu:1u-Ϧ˶Zjℶs0r(Y@,[^L1sBNG9D} r>b2i{šFOl}ـ6/` 8dRUʫ%H3ɨ22ga֡dl*/&XγI)XV:R* &\}je,ʞ? c>Z}ؘ!a|TjJT\L}S {>ZT%U,֚< :y:W)Z1\PчXM^R~/ٴk3{d)*hqh6^RNi Q,Kx96G" T2tV3uPH (E jlp8}m$0Te.Yѩ;p_[jZu' tr9{ҕs>%uTwߺ~2T[jf3Ynz텝ٞX;wrwssqCYuGw[+z7V[_=nχAsu=7Aa q ?DsC'Utzww'ӐdG>'~)oR/Ӹ(sy'Ԃ:FS>[>d's!}6]CLӁ 阨K).TJ%ZǓn<;ѓ2Pn/LTQs*et!m (Ӌk.x\}۴ybv`fv`?0?'c!_mpE?v#z#lҚ]M֡CĄfRHɊu]EwhT2$ v>0wvpCl #z m=^mJS`cD61+86UcE9`8kmmkU6g3~آA7&gGOh =__yY{ V`Ԅ#̹>zis::RƇBGy:x bX'¤2Nse#V-&VMײЖFtP8PurpBp%{#%m-)dE!qq(P;QPS m PVlx%_\*MZ%9U#oIFY$5pֵ:׆^s?ucf ejwfp>#|#,:GBٳ)r-¹Q }yPI2wXFɤs2W+Rx>Zɛ+RhޥB0!mM_zwH+(xS>{r>YSz2o iM//xPPG?Z>[t;/1NjYZA9kpF92V0:k\Şt{u3_]~jQ~깉o]:/8J~ aF$q)gRUjqJF;hI`dt !!9QA*͇) Kh{}nn Ւ[S:`Ǧ_{3*-Q/OG/{-9*樘o"QC{+֨wCNZByx@iB"~<2 +{πt䝮`gFjIZM,ZaAsf&;\RLVӠ{NZɲ3,Tp֍!ɈV[zua9xrf(j6q8tp2;#]MxfaUTJ&'JQaVsu>UBC`\ĺMAh9,1!U {޶b"zmf8>J|hOMlU49 l:y[uҍD qXJ=HgՂXn-"׾=ErͿ_|~?(~eѺv/ }7Uo\UօierP \|݆I7_n]rƻFp]4 SH&$o|3*.~ V٤onb*膤J Jʑ)< l#{`jPc&xEzֲS9]~ZM9jZ`MlHz[CQz/:)eDý2璐eXÃH`j%zޘt9 CI5L}H40Gt/{ DZ)HlJژeP&E:b(Cr1yТUք\+SMR@rggǯăY'/CL/ aוgSLB*8$iMH"Q:}rOb-(!?nM*9P*Y bTWdyڪϪ:'FzA+HL$yH{8mz -&o$VĪK!!R V D0 墭'.fK80dR2*S`HCԪ*W ;> Ōɥց"^b@aU 
H{( !x\ĻU S $2UA4Z-Lo#-:b)$)oQy5kAQ"BSDOY0qU>)F *,HƒZ٦jQjT@XX@I(CY,K&ְ8DzIշ @ћ !]X<"d,PMЛ]\R?,1.%L0[b%0!)VbEgŎAL$.0 &*BL‚oN") Ag QDO[9Az*]ioǵ+}IγyZEB !-￿S= MdK$k8ݜ]}s&4%\@֬ @P{T gԌQ!] QQEWWjԑU"y{x)Z6򅶎?U Rݕ(27j $!) e<5ID^,.z֮jEQ_U0Vr!㠅bLc`"@So eXxኵ&kE\uXgLӄa >5nNeo(F%̷<kc@JRv'!'xğwaZntl~ZNJq[UƏ^p 49!V˚6yd-D>b‘J{q2YWҥ`ruA"ܺJ78bgP찱ڢR:/QhJPE"&4iY(SM[(T)pV|OA2Ab)ЭLmQx nEe06mƢ"?ON_EiVpۂn2З$ `|>IR`sR ("M'PG]rA .H_GWT@5ʪ+@ @v3McBQ*(S`a @p L>\;djy=*f@ZNmJNq:-I \9viEpQLDi4@t e\1BtDhÕ,cy7* s.CpYT24c};|*i5kNoXfHGY gi&0QIЙЀ7 RYU[fy/ztVAc-B-Ia.y͸t*2 cmaHu+m0ZxҘgvNî_elZ9״$h @][ qtU  G6T0 0l MwFS]a I<54灣d`097zS雏 3| gz7%xKTM:39\nxDhfk4Vr)p A(N( j=c5q^ku&id bV"bʅs }rp 6r 4J&xyce腝 PڢHbQ}TkF|Rz i,27hâHY9zV19EKkL0Rk+=֨Ur@e<(0SA"pՔqOH2ˇGH:? E_H*R3QbTqZF (5rfF8HJf:4zȕAu!֯F`J$nÈ{|tpG=PYli3φbU~f" RLYY 4EL '#!2 .Xr좀) Z`4rMhui]~VW$<Xd3-#rWbZg{qlvza1EH5]uʶaV_.~K莩*;8b'x2Fu-{/$nc}O} k>K/O_3\*H 蓻tJjTNOΟ[ 5Sm ?~ջJzӕi>NOP>$p+vBIi@QV4pi3UV5ӊݚ e^ЍX}FdfΎk?0\;e}Yr:g'i+et͗hxeۡzx,1r25,l Dksj 6W #sȶLkb'OsO{^\5 7rrl_mo(hz%8C{lu[2GHoʑV;JPQy|!1Vb(mߣYW6b77WEezܼ\g!< iy)&ǡO$Q>[>=}xGT ܷ6{ipo5VWPe_N.6͈麟G~|{vzEp6mx8Gc+^(M<;8zp7^ɲ+)Jo.#/e.:/\>Ga(J>}X oOܜc|m&:p)fSV1K}6;I_uҦ>1v-_j-p8=qK;1֥qեS\ڎy~[ {0kk@RxipXVΫbMiAaW`6BLp>I˯UJܧ q; 04/+*D=222UT7uX{:dZ{\go_x_ -(]$:3N֋ N֟gu7u)ʕAZQ)r6b/:xh /=;]<.&,Wxx1C ch|T&U.`WiͤEQh>q+iUSZu}7.lwYZ]X?#-e^2rMt$){oe&ejz9ysIܞ@}!ك޹'Ϟ,SS=o|A#N+)?fq|zݬuZs'+5;YTm=ǁՙ"`2r=\.N,ԧ.Q燿/^H##_H2ŵ/k<3;斯k_JuݻWx"}3^GDF˘{4fݬݬ%LS~R ic#j% `T Rp&,UԐޥ!nsrNrn׮mm2Rc9*bmgw&f*?0&lĨR^cG͜를OfM2pO@v~\`ijj|ݮ2/ q~5_]S )|XYY\CձB{!vC J^䡑TbSb"\n7ZKE;J./!vwχov\3A`ke.؋3-M|阫V'Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+"Q+zE4>&-?ݶ= XF/2V:XMyI`}luPɛN8*Xbc|mEI.sϭo>AjT]6lk*q&|h0ԡmZڮ6ΖJcĈ̹"}5 =$Wzx7g_Ú>8H7x|b'?zU{&"H${"A鯗x#4M$h"A HD&4M$h"A HD&4M$h"A HD&4M$h"A HD&4M$h"A HD&4M$h"A %A[xL$hQ: *hHаO k$AяN~{v΁7-gޝc? 
C4:2\l֢[ O'Xdy31;w4{{Աr陸௓qzn;0ywP)">'~rp>H.W̄.wz6 0[3nswhc0Ǡld߀o``WW%Ty7ɏ$ק4O\V/q{\=חZuΊ|~K\,YW A m٤ 9Ƥ.&Qb!VjJ*_1Υ+F3-Oz VrOZ] 'c1[\eu994E4.*\׉r,ٍ `ɇP*Y¢o |[_sx8'0PSg@p9P/ܜ$wisIk^<33L Oa[)eѯww%GRsb~q3p B{|4ߪZJۿQxŤeR1EY:tgw1 U?8~]eLU):2-c1sY?De1BSUuH3^%,@2Fϴ,s@R :{$c%Inb0,X:bҙVä3XӨ0mG5MWжx4IZ)ݣIR*y4Уa*[WI'*۞ $o!\q$-+X`JJ2|p\؏@J6UE*+[WIZZwp Jb̤l\ɔaMO J?JRvyUJ.qGAQ~~/+y02G8w-|U/`B<&IEƸuY)< ҥN ~Ž䭙xCJ[|+Fj#"Ƣ&8RRt6F^!LGT_9ZI\OG$9"N. c֞$`m$BW %'oSO`,%'F{I7pu?)m{jߩ4݄"JkqRm$-WIJ:zpEh{"+"Jbqp$\=KBHmo+^}7=勊տ^U@'|O{_|[(ʰEPA\[&iS$D>C e&SJ>늆*I+رURv9/oh\ CMbJR|pdgWˊ[W 05pe-p=TI)iW$ >\u?iHItWB,U:06+Ѥ?N?U`yq(a~Z*I鋥; ݓoc蒔|!PjgEi<0?.nI.Hm^? ճ6ȦWU=~MɇotrĖRƙNA]aQYMng×V '`%ő tB&""=f6H1ئ+3g؄@gr旀O@9@G eЦwZ D佖豉h lEi0 =8I/ҼLS0L=,#Nyj_N@!*Z]7X~j0VP' qnZwz8u¨BnaF&]&TgUtQ ΠtT:\= Ri?GJfP2uj7 煷W^j^(Yt]B8FR7bT`d:݃Vg> G{eҧd4E/w]v6bU#t;ŴJfKM1ۿo`d=(ņ$4Vc0##N L:K%hiY4*kc 8@͝QFGR@0i؂gY߀'= Zk_s\SL]~EYȳ/;iʳ%C]R%v~3# mr4HEjP5;S*4WYx~qfv%0q'76y#WbJKZ7e: ??j|ܺܕ\ /kڹȮo.U{Z}]"*P76ȣyrWJ['Z[t^Yqw5LYG[5,즳,Yydc.f)RR CJ,s$#MMZ;Y64_ abynumiA<%JqU:5?/mpgs!]˒w2; pYwƮNlXQ0uӍ:ւur w.G35֟fAtXuKBz݅(Ԟ#Uܜ`sWg.^IkȽf:Rj3fȴ1K 6=/LпeRf'|yk>tjCXv1F: lp6HFjRx sQ 97w]GnqL*3;</B"S`}_b 0w} K 5>׻m˧Mtn| kVU!I~.=mΥ <7Ocѷ~-鼭!W+ڒa_l;u(%k I`[=ZsH⚤ +!('p f &ceȤ" $cLK21 Rvn -FJi "&K pF=(#GsG2>xMA45$3NRPHqC-PBhw `&(;" 8M(rXtdXD -J)Ʀ#ig;;n!74>ѿ ̰;'>n$m{λ䣣Xi|kmc5DRc>c9y`pǣb%:j*|HK.>pYzF>ۋ%w˨Lh҈QSa1Q 3dJl H !Z ,KF:R1!S&aJ;f2q µzqy0KnLh( _z$]KkIҋuTw8uu.)MW]]3-j4WΪ*$-4CLHHE-eZ %mqoLEJ/:|UkIkב DJNcRY*LU 4`V < kB➇AFm&t0c%6֝nS0W`QE\ 8.853>T<#"X'5 _ \4=Ҵ!%Yԣ[kZ%7ҤNTB8RHGQgH>g#?ϟ. yaPp^}?j!|o.m'0 85s*`ޟ?/nOʽK/ăSoəL_E g'X3P/SdNΥSwb"\I1< }U@0Ь! 
H m $wgջEa0Ic,>FLo˝72%T#m1vV8AFX[tĉ۩4Y׿kBg]\ZqKHͩT9L~SU7-r铟&o ar&̥b??-](Ur5)V\lpa!%ֶ.5C7faQ0iJ>o.mdCk[%hsu'׵1$Ŭ"9ñKU__vO o;TPcZY>>|t?8{OOg}8DǛ?Uo0R,:$y?L4_voZꦩb4-u5;v߯J6w4 J?~zKIn>۟ijW'ҕz{ f~9==WUQJòT,@%(u۸(Bb#2gKt*c(NYicMF_m@{1]dV0Y)E|v~l?@_c|N3(b5pF&Rpɽ3,V+U@1 BGpN(ÑupSԻ#5>g]Yo#r+}N >=k3OXZ)e34fU[͌i +B" Ir`T,MC"|/XW1X疮X' F9#2e'@霹,%X"@,Q^}ntN=Ę>ٟ T2P[̷mhs?@ A=%'zom~=Ojı!?aH^6(o4ggJ;sQ[sS[Q[hTB.E4!J 1 S }%WC*m@&>=f`J} Ww_?]9?Tuzm*' PHiI%GIYΒ.s1&(\NqJɡO坣OBG36|:|m1NOžͥF_{;-հ s-e^Vi۝C=J &spH( YlxtY dX %U"*ERd =X) ޸+iHem)D QuFmasBq PP@HUk9#bC(Jϩj%TګaUv) Ŝ%N`Lg0&id!5$0Cr1$[YWܠ4sI˲pTֳJeό1 RDwJ~)r^k͑[F:E݃ϲFH$ȉ[0(a"p&c:r6%c$_'0?ZϪ-gG=_<APro*L$]T%1I8 ?z#T7 T ,?7; ˿ddׁ;"]pNP GqJ%9*b$۟)CTjh8i`ej6sEY1d\BTѐ$IHR'[l UZ5$ej`YT6q%B*r IY^YA;ׅE[C*-MWpۤaPzcMxx|vTCCT_>Oc ~t+EUo&!M; -(r!7BqiOqZ2+'ѾFcUQ 4ZـgؠGѰL ?T*~?p@3^h^BL8-'/4& T k\!eMp+[B}8d-mfBwB̭q+ fPڏ}jB#+{#|;q} }L󣫁~\3Tv&5 olhPJcl€~NV-ߝk~`p}Aڻy"ScB-!I;4/ߍfMP$9[ÚmWrF&%VYXfrRDc$.r!6N.*{Su1n}X=z2KM>ܧ.fTgƻǜ`$eRجJdO3A#bA$YfsW T0Di5)Ev!l%[Hs!!)`3h rZm9;<=JqOů^?Tނh>>3X6$Ӷְ@..N6T-& (Z"VyB cL:+ 7`)FkұL)MjXQ$H^׵`9ƌ ) 2=Җz/"mMsg2R-clQejIơ*B1“ʽM˻ ŖteaeI4~4z=L?ŮrbeZa |L"&& PC4•S'uYKt)`lleU`LeH>("B !e-GM(!v(Ĭ" A%} {rбA>9Z' /T"EˆBIs# #J:q 0hڠiRӮZNK*4lMNun?N`ʻӎ\QSCVĐ)F.FV,{-qa+eY2[fao%SC=t"*sUkZ컹*\ev՗1WB3-^묞%kď#>K7ڨlqioƩ*10Y[5|'|>]kߟ[8a'm@{rA9$B 1!vb!viWx{d\X02FazD]‘ :4235Qk60ry-䳌I:s 2qzW[Ύ<>=wBoۛmuG;Ƈ٘Nd+ԝ`I$7䍡_$3Al=(g0I93UF8'O%8.cmsc]Ed`T[v g}=.{_^Z"M1,+2dXs~& C&T6Ĝl aׂZAZ1Z?KB&xL7}PaQLi\q#Ry}yӴɭ]w]~l11~=+kPfۯ]֫D_´?៣өfmK_GBng; !-a8@Z&w+ءJe]td)c_3^UFX5jM&"]p;}o4".,Mk`)i1ykL7AUH.,u {PL͸BoR[utP.3S1tuOq.%{CR"-); `Ԅ4;ܯ V8 #eQOnBaDHTi)48IY%gӔ(P[G:Qb(yb(]b(Qb(EbhN҂r*aC#Q\'|IUH"AB\N)JWt IRC ۀ 'MP872˪d͙e$3| BH`g∖P՞Z8*c+RI˙kR}K_ u)G8D.Pu CSZi5Q%ϥR-jѤslwFQ2a4"/_0jFww_ujc=MǛ1F? 
!M9C?l2mNVzx-hB|F40 OŸ?IMYN8ZM?DQaVl٠7Ro,uL,X ]4F E$V^ 7?(ȺvfkB8,|߁oLljшu+*`pS_3,.O^{4sPA1/kzu{aJbhЉw1uޫ)tۢra=n_xEV5l\SЅ9u`d^0zcמߛcL;:9oؘ I/a>Vp]bweA'^Ҽ cvjCd4&Kd&& =ˎS-;uҊ.Nu(‡͎gʕpp:»/Ov`Nfʰd 63fL@څl@6f3+ ͻr) E]N /gĄ)O# w:G<=g7d@#*~>|?<; "l꽭5ZPJާ^A _|ͬXDIJ*6 6 nY%ByxNz]}N9uN_~tW`gѭb v Gip e,iͥˊ)`G!$3խgT),J0q Ѻ護(J:&| (AYo4aߏU0(+4yB?{OƑ_!e -2 ,<'6&q ,SBR}MR%$Rjú]])#ris΃21LN?>I;W].;)!ӌCwɘIT1P+^Sl*9~N& v'Sdǻq`ٽO-IqzBodn8 YЃEq8S*?-v痑 1lo֞wbkF2%_UE_ڛ\(4Zq@ukvZp[4h$SG(B\1ԧޅ5F͈?;OG ok10[ãfo9%Gi]͑.6w@HLۦahfY>"!8< w0blѓ9iB[gedEڲVE(uFϏ> Jߣ7jN1|(|C%Ŀ!:_>{>PQ w`}b~SkVjjo>5װԪgƼ[pz9RmBޯ^8g}Υ&Ztev= 0d~ܔaOz~\J%VCB4wSimNm\1VHO{ΖթjRH~;C33<Ε nA CY[ud'F5^c~l'GF'G;myKcPg,WR(vT!HQŴ8@C";#s\դ}R|ym]2fM*"s8 ]5V1[EwUj⿊ĝ⿐vHGQ^yqoui9zx+Ƅka2yOyj.UMJ ^0|DU*@ r™*1T]^l%mC!L%(atD+54rgj4xШVP)AKеgV{gkˤG(>C[}7 WoJٶ-s[Cwz@]x){^J'Оj~>'wQʘ|rP5#Gڵi4ZH˔Z=ah$#4H!x\xg3iT=[Mm`$/y/ܸ{{5mg<}#A]nC&P(M89b @*!IM/:MoV0כXjvV, ZgAs DATd唗Z)}V I9E !1"2ґ YY)Y45)@ţRRG5AҒdbIC("rcRRp= q D'Rȉ50gM:ۼT_ D7%d%MH%(E䧞k+C g#K#$Q9AԈ҂1 %)ȄRkT9(u1?uFR9UOI"75C1&m1D8PP`DM̚j *i-Q(86*Nv~4 /YPv;M|I^yo'OnE/G&zPO>L"FHVQS%_ S2I8Bg|Lg"huΫ&eYp^S BGx.ښ(t6b@IFf6ؖyxE&t<4N{ۇ^?y%d{={Y$.*{g(Y~.7k?w(,g^e2͑q8*5LŴX*gHNzt?Gh79m<蠒)a,feE! 2L >)!8U")\XύTR$2APczZĜSPLJ' f2ڤ;.ig!7Ut%`m]/;Ki,5A6Uk~T64ԍj8Ĕ~T'*iT%[iHr+l:!".ܼEõj f <},eK:>w+Os QnY9n79[ ۖӕ4R1HL451sDR!VMnp2G֕h~ou{:e4^{F:(IhkQk)/9JD<<L& -jF ʰXJ$d|Is][@S!(`h3:RD†j1p8'=J,WlNKɦ4srj٨E˯o89ݿxs~pIus[RK(W9DsƙRʇf*\J1ytZE! -U҂:!ZY3"s$3:W YOD9=8l :%*In)`2Ҝ87sZ5ŒB( Y . kf_qKws.rsl⧃I7*;3EC4OfFsq'!(fDD02WMvٲXNT’z2B5LG"񒡌ģ%(ƞM P85),eBry QFI{T(n:J(!)m)b?ْT|6E O;mi5"p' фE 0tk2%q!C'.d(pK\x9 RsaUXmj- l:P.%(σ#MWyFʟ ~?Zj6??_+wt`#RFB=(UAy 4'B"Iz4R ă "x%3hrg Uls&4Y$к& ygC_H}| ?S・%Tr92hbš6Wۥ2BM|UQD=R\*:BcCXl<^1!H & J@Y,g /qg sRViUTPi)!R h}p{ uV?$^%5Z&G0!&n A)WDHQ'X yб@bpnAyzӤ̮3)AuQfG~֓]|\]zytՌ^x䆇.\tnđU{o~L^K#Wx4Z׼ί/xY[nAD*ئ j05 .% -l| gz5t~b1WG3U뼐ڥ3e #:s^UMr GB|!ع;WㆺBS! 
V e!Є !iÄ'l  %p)){u~}{T,,mV\kJ=!&ji.*@zgLD n MU4SȩAC̠d C<"{' -CERUVʅ$PLZ6T9\ :t:I{GNj3~ўry ZQv2 g+#QJx$*EdIHAlQDK#c)?O\jm!7 Ym\F"|J04C(l#gw&y~%_t) UkFK5kY\ޮCl詜/=CkD{wmZ!هs BD'N(B H6I 6@n g0& AG75\U%9g"`=XٱU'0JYʉ:苌fBc N}.hДLE!4{? ׷c߫RM): qJj<\671ynE1*dD|"=imb6'rN(&neggo^Ё|\;"q6֔)N#JbY9ߔP_[mރJ!劉iS8xL:׌gT3^S욜]rf:^Y=C!xuGZ\[~׊D+7ZOHt6m{:>M CiQS8e)UWĆv05;u6dJ-Z*糫{F+.8omrޙ.r :Ir5iu%8tgJ5#ޜwmFٮʏͅw7׍'1aE̕}8N|m[۵"d/7K;^-(u$!m>ì3bI#p*fM>n ]ߍ96LN:*#G]NmԖj:.%FoϏQ~A?gt0>>Q?O~|ZOs6M?#@A/Yq\m^,㚢Sn'{'vWn>A.O<ӟ|t5+f+q ya=G4T!UD hCf O^rjKԺG:V6[6IJAiqi~y1v0$CY[uhO Dk.=/L=yKƠ( XJ N vT CP\i !sM `Y突]x:Ts ,cڔ"2 :  "DIfrIBu*O!%V۩ ڮGQ:oz0o ['JZp'ivUNTTᳺ(%SZZZE*R%ЂtJL2su_h/XhNJ4:䃕&* 54rgiNJxhVP%AOзݳl+}gkW(>!8g9|w+Mo˶m ߛzG24%ZԻRΟ2WW">I#u ]hy8L΢k5eZ[Er)qo k qSӉ9SFK29jYY$3etҭnf;7-y[ *w7_vb>!˖:ӕORnʹF0rĮu'NԒF,Sj0}Zs.Qb6ђ ~U>x{ݫo{HB f<\%O`Vw`оWT|(0|SANs $GVK>JjO@Ӧô7]Հw]i=#-o"_f?vV$,KV>Bs&ނ0T+ >(+䔎Gv( EWO tp*🷥JehyDI/ôxD]<'S`Ug_IHK!!*q@T$crj[  V(LTFuFHRLItYV2DhRQM=X$Y3eEFT pLaI0t0+QkAt1VENqV-qֽxy_F2a&Red$j !@J'!Z$g %C$|-[#?'(52DX0$E *hC5:A}\@ٟNR9SFIV}LMm1D8"HAD$Ǒ5 hI i-^pJS+_IZ noi4tye$8iDd tݱyi!= m99Gb,+ha-ヶU`D “$="\,bD>&T&k:J;ãA`H 1vVs$GPWʆE=w!,]֏*d Jd+9h~f|dŒW53/+ޕ"2],so.SC|yvh0 0e+tS?Fo'&+a1LL451'(w[ "Rr0$S69k5ҩRۈroIw3 $J G/lp $"{B:Q0,7Ҡ A2i4/쩏!nqt5N@/#gid;zNo;uPOҰ{BmSCen6-l=˝OjKnI|T-c=c,X& :BQ NMR@sW2\p^&&[%0r|#GZC"vPJPֳxՈ;sP쵝H`EJt`%iң0}U5L_F-qj"]a /a } ]_=#R:!qNF\er<q=p_\e* +N$2JBjL%3zJP)+$pz2*+NJuqҋ7($ L4'#2` WcyF +EyB*|B[\OE\ej/2F +P~B*,Og=Nf*SyUަGϯ F'f07zTZZJUMŰMoϛoZ_Rށf6e9iXsPqpW4ގYǛq8×98ij`qLQA:w4QISφ+qX#17<г bi:LV9_BtZ]mMrS.*Qz H8:&n:g pbŚ8 J2 P *r*c}ʾXS'zb*> ݯS+J?{Ƒ@ld~?퍍KrA5~Jdr'[TbL^Y:Ε%)SUPq?&7?o/tN߿yQl<3$Fp0URX%AEψ h'"B$5b ~%i Zt9^Hnao.6[~-=M^wgo;/^coLɭq0JN,|^iKnQYd΂9{X]`}~yes5F@&>h$W^{E4(lFYTpG%Aɚ-K rHcrGae}0z)刉0܊<)c" #gC? 
|A` |q7|x'" @<9$ F'@Fa@X%| >Zy$)1E͇`?f.ؿ_ +=Ocb4Q[Q*X.P$9eNѠHͻ$.ulg0Yz ӏ1NoPtNj Yond ۭzw)]Ѵ s)Q?^+Q 3#Zi$Kp3"اmLDX,-,xGI4 G[Yd΋k-s_VX``ŤzjsKz,w˙K8e̖1ǘ uˡ5|Ȍ2Łbjy^WL5[zSՄItQORM]O8D6mqFUVʝ[pnT-,*{XiJ_0C؅{&RGkϷn\Þg:+sP&{&*(1P@ؠɤ3h7cު2,IU#ǮaiqO;Behk[UXՋՊgܭ噡a{?mqҀo+vz-#9GS!B(QXERZ9I9&xLf 0v{do%BRp|Le91xܠzhxnyF7B IjH:O?EGBli-V .]߫J -€z?8g1ד3WI蟅+C)Zϲo~u~?]d%My8zbԙ]~[m]#B/Oֻ;p\yw}m"jxoý\63[sۘ0q y=ܘ5/\7P9brG ܰ"H|N:/ЖцD\`J-Cߋiף޳y o8:Ҕ0*4"Z44bTnLE@"Su@BٔMHXG"\JlB Octc()A@8P"Gg R5FΆg'd5M<_$J:.ZdIIʙVZ ?p ,>=ZhANܤX\XjR*((1RbK!-}(6T.7(@|f \@3S_)TD<f&gi%1#Ay He-qObC%zt_|4f%f֍je+xW`Q)+t 8iSd(ψ ?sj:jڄHJzQJŮu:p:DGVɌ4)&Q* )efi@\ |P 5qA!pf;]_Z$K'#rFY\b$7[nR ;·3]9{r?)_$Oc`L&05aX1P/)r~ίSwnpk_·#硫J`! H xIY*_`A:J<vQo|z8=xqdw$*ʾ6bz&W}SV:9w!/`${נB J>R1S:+pR>yWx= Ζ%bnN/v;WR=]GlޏMz/Bu#IT;R/nR?LYUj00~!`*&Muz8sBvTV:dݨ5s!)&9jHGRC_1Kt/9A#9bAWt*`OVv镏3׹_pwo.w~D]^89#AIe7<?n>"M ㍇m2lrՂo0nt5_2ߩD* ڎւ(xUϿ ])e\H,OqB:6=Wl`۲RXg,R%Ɍ(q! 0v@/6E<%Yctꤋ-cPz;EI!` ``, *0@!=Rr'5Y?0m9"fZfd*[{l풊|CAγl۰ȅ.W_z7C$DxJ vgIHϟ=s<xאs>3*kJW6[Ax B#A=̵*{eZRM$' KpOo |JV® gϗge|j'~p|ٍ`NO}A a7r<RОݨ$3L QJ^J|V9r&` svQ"*l>H0]eA$ϖqht ַ-u+ݗY*ÚAE$W vc?z7 ٟ;z&ۛe:C^,44w͠ kU΂#"_As3Ĵ@; yf`11o(AH;dn69s6 ,EAqaWՀVVO5d6?Ok昲Fy9$FLèPb SȌ3R KDTJ'0V@i6(҄Hƀ8)$ $k()rC$CO) 6rQK'V75eJltl*kmRO e)g!$"uHEQM* i h=dE- Ѻ8wEq0jp4gr18"l5G'zEy! 
OҬҠh$0lHdD#Yb^2p^ !HqZ)R#RvobI%JjI}[ !{9|U]J@P-|l:[bto OrW2LXdi\ɄbdL2j 9TO `TB)8is(4@KxRPo!`aȼbs $Sqj`rH0( IJ@#Ԋ`' vҕ]MS :K@‘ h!!`Yad$ydRI:Ba ة4f4񅤩mV1,QEgFV 3SYRu(0H .3EE5wǫ&!nkzz+׌D|1ܷ`ojCR'`dYbPE X$ xWSF} 0G;X.NOw;Q}C55:AfUF?G-]˺e HM)đAqKcI66s&E%G䡉]Gz CǞz{z,>ٌwMv21ursiO/xs6KFAk*st2γ8`<Qɢ;E;"tՉ߃"2aOaOaOaNUTF'Jx Q جW>{eBFR#ɺbm ԈRi>jsb-ֹ(c(tD eNRi6W.enn\x7c<%oe}݌𑋅OH9Kx}+3pgQd.4l0vcۯًm€\nExޢ,w-=SQ*Bw(BCQ{?zV7|yiA~–efݼ$gsb [1r6.l .y4YTb^%W5Y~}Sso;9v=+|>bnv7r ڜ}"Ac #!EYR+"uIҗqw=čڨ]mTkN~HeԶǏm] !NfJ:TG3-mFb%kb|QȦn :UJLQ\XH(o){eoҨ@)X+(6vc6Q[oO{WVUUjF'ݺ Vg(6?=RmZfw,伷y6FSU͟T lE }~yJYI+1%rhC)D d"Da1K(ʐi+JJ%QAmbѓ/pA%<j-Mvx3JyVq_Yedg$qCb_H,dz͛h6%B2%{l("d"(auE3tE'0EkBС<@?!TbS *WsR#Vӹ]b$]0.wEj7JmXjAj*h ʠMQgfF11]٬Klf} gxlEV%]vZ/x7ĵ.h-bVg m5R?Bn1*k wm FEPu12rME\uyzb^~|VSkàٰ(5/N954 #J9k\n[ƄZ+}R)4X8Ё\SIYCND%iZMg5R}EVX.7p;c{*i΢ aZ-(H)h4zgp65V EbmLh_lVfU2,ڤZ%%e!bIY)-sDhľ!=Kr'&wW~ -rXt<-f}jKې ~VW)UȜSqSEa( ŔCg8RM[5 hu"$t! 2&.jFzjE8 Wb#e7{ևny4 k<.I޾E,pv4N W檟zDhO%f$,Tavc۫ꃙ.ՕTR?NԢctŘYͺ=Il KIzJ흈h-uY>ٜ^nymJtA(D>0iQXO"5n: 9vRQ4w秩 ~4zO|>: Y췫KW'_40gdz]<[ql$zi?g+I'(S?Fִ{]^_gLYmד9MboELjϪ }}'ɷץv$+6xHǛQqq1ɴQu&N竍}h~Hȍ*qEnuZazY=e+#?~ٌWUܓ39*C o;,.!7w}@*:xo~7ަ)X) &&jho5&fhr b\t~HA{׺g |wYG:}Қ I[xNjDh`o?p3l8uJ ^)}ueT o` RFU@ !TCjӌ&&=9z ,`]xgr:l~4^`aF?{&DtL'1 @\My2]ctμmCOze ]ຢ`3 uRZ=٥;Em$DD+--tI/,: bh!aNɬyPt"FDҠ^t M(@^mW믇K;^AjCy2}U |`urFMe*Jɻds΁wiY!Ş*#nI86Յ2\|߫IYw73Xp [[/Gk>J #<'eux'Ӛ"BE5+1XXJG1a_|+\գ>j=Zb=%!pe}{?EI=2t"uBY-;LWOF=uOd=aԈŅ(St !蚄%3"Wˠ% jꕖqEJ."ɤc٬t,B 3@|A@<'T }ltnwL<߭^0.7zY;_e̶Kŵp^cߐO;S>UJZ3fǢevZZ!Fؘe0K9a`&̻_=\Mo~n(v[ A^|irc9&gQuv!SpB8$c'Fj ?!7EQ{0Ա~9*˘}|@#? 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not recoverable as text]
菱,k)V"Rc1m7dD_YO,L&vM`|0ɡ}ןmсxHsQ8a1FCw܊GlQߚz(?VoiְdΖn8ۋ6+tK10?zwU#:X8펦h&B 9륻XkF%cC:=9Q#="B*8\iu_/ %t?4 3tu!ԪrWg:#Ӏ<@yߢapoǼOɤTӾֲ[Nf%E9 @|jWnvO M$H5:#( Y`#穣#$1D,FN׃ɻ3"9GS!B(QXERTZ9 B(˲a)mk*6VLŨx9Q~mU!pĮĎq˪@QJƏ3Os egWx26u 8S^$c㭓 Kt,TqD Pd6(*(@$\() )gPuLư|jx|;:=V)n`L[AXB7zfɃ0 ^1l8m+|L NnngPmi>F˛uzFL ,?ϿEqr3h]083 Uʹ/H2Lʯ^EYj+R[2Ee2ga.s~4v~^Hndd<~tb@ی^ia$f#ڿ:=GMJf0+ „iJB@ c)8fΌ/Qq|4 ~4}wK3"dTh5M5ݜ$Pg]q0:iL-)bnuA+8ҩe (WKMUy$,ViǬQFYJyϸkYǖΆӲx\o.WcrFrJ8\t>]o}ytǯ .*h,K82JJKxt K0J )he aGEii !',u!|RiBH"{rIoAA:ζ Rp96pYYD@ 2#*`q%4$@*bv iȫf]3Jad@6!j%pl9+"HTHf,"X\əU%e%3)2qߏ :C< s/d#Rv0:drH9F *C_;1a')$*qp)BSX*WC1gz4‚S>;Aq>WlZ|&=C ]_$ކac˶smqgBPJEʦ ؖpNCKK)FDOAs>o%x{fK@)Q[Ӟ;a1YKaa=Qab-6$vz 2q4<dWWLɹp^ied`ϘS)E ^XPrFͱD iΛykx=Jh6_Jk'bfN)S77w}V=8RRR' R08:LO7gΊ92$!J Ex˫\ς$7_ӛ6&9O=kjw' Hl49A( ImK:I;ఀ?1vYoxJŋ>A)QTt}ɬfh#}@1BN':U38D7;uwHs=zgYufd5x$}$uEcH"IٳTH";Uǰz Z,*APШ^Hf"`T"P^腶)w X^w,g^`ܬi>׺FYHf~ &BYG`yK,Q ρz :<Ű,x]2@/fslrHMTA@NRwV~OJJDTxw>z/Фl@Fd:-I=;wF 9#ptYO|#eMU x+OͱA4*ya% 0e% "if\MTLlf!]Di,j7^ɡ_g,ɎoqUe ĴN@B$$",z']P!P0 HC !E8Gw4ƚ4:Cvձt1z95i\pEܤi4&֐NMaq1&Mqy1&rN7DlbFDLwO\ s'ϖ?"fc:4$ɒv %(1@CO9&cRimB>vH=dHk\"#_:xgoFCJcxrx3aμ'-x][s8NR| 5v+L=ф<Ӿla^jLwax4ci@Qh\ʢq6hN C,%b )'\tҗ3s 1z PZIj2P ^o(:6 ;#g˫wBޘ2gz=KJZ,.b:2.Kqro ƘMX8kLRSA- ЫJFZrod_f2T I:vQԱvFqkųYsNqVqyUM MWm q[w'<9=omE7Հh` >~{N+1S(`$*t6:,i/7],:#*PY(g3ŭ&r\zGCɎ21Hus?2vgtw|+ c,TG,|P,\RT6񆣅mqwY'W4L6OAd#E4!e0c 5'd2J؈DHLC}Puv7i셬æ,D {&^wL |o) ]׻]s?b0ڝqǮm;Fm{D{gNo(zl<*ƊAY%Og˅ (t{=Ő"A2[؋Nĩ\Dͦdp,5)Lꂡvs?Npƃ9k슈1"x Y C1EJX)2$4Kuٳe! 
x6wRq1gj*bK=iJH*3uF5^ȸ8iuHθdG\4c\tG\<}>FB@I@%S0*D5Pfs!Y 3Ԝ鈋w슇aM 'N&rȭ..6X2G㝻.|9fEq^&arSbe?,Ihe^e4,~8t,W>L٫``]ծ/+q^t|1OOel72l+2jm*]SVϳ,tq񼕠EF/g7ӍO~kh6_-fhߙ^D=z^PrVG =n{Ry3"\,/IkMB}~/~|HG 5LhcƇm=F p5gXhPÞoՏڤq&P㛳ΞŲҪ#>,nB%FRE5En<hs*"+0EgQ$o!6|l{J yCMץk9=.g'2.W&ȦHM4ѳ ⫈r I]6+ȿ޼$!: KdR!tpy,ȁm Ϊ)*]0\1pw]ށqyGz(eU4Jz!e=Q' |䄭(”)FWcE0t1@:נ,I JpH+e q43r3!5>\M=2{_ת 6l߰~竹=7}-sV0b̳f"2J4(+f$] l Agt{G"6`A"\}upGFʰ i6ba4PrҊȐN9 ꚗ%[u߀0WX)ِ?(+I5|%5 ݇x,;6Yn,fv~qX^ԯ M73:Ks];PɣY!ɣ ;.ZG㫽=8/G,~:/e?rۋ>.BMQTNt:Y0.Z5,]'iGg.~s Fx^c -:u:ozWv&y9-?<$?FC|s@peYUX[fGhYA-eSit6^x_ۑzQ5cV ]\^ q~}-nd:hg]K-[C[6 +~_G S=;os掝5VǛu7Dz oaPy\Z%pf̊ZwfjkS eO戡(\0\Sy;L9,ȅ`lPEdEhU ˶ i X\;rDb ]"VIk52b*ejd "ېv1,nr&&^"2&娴E2BJ$K#[R xS$"y2ɬC.[}$dkXymx6&d=ύIئ\I)Ō~ pdfg})yÐeFfhB,n6 z4Hhm#Iï10|P/C Avy9<+fH57]?B0.P(SP|B C>c⤱{gF`&|D`P*c*ʦ[L+U"6V)9nLWn_ iJ.LEwF%R92rܠj Vڳ#lh{X)ۛĨ%;i,%c;+O˙3V; eg]o#eیYUIB1 D%mI9 !u@0- |~;Bٴ%9ܫ*MNV˚ Zǡmk/HG1^ BT8p`Vi>n:@G6+*ZHL.=SnU L͇(yA3 vXmQ|QE (%)x$L䴬yE(C`rQR0-ac=7DhEE YR:[Z#&H܊`m mGΎd! T?`yrx*ƝfE 8-&sJ uYI N;>lw[ƸsȓR`QD6 ]{ #@ |um^_b:Ejbuܡ]cU#Dꊺ]LFQ{p!YBGmhv)0ִc[ 8As^ ϐB.ds2V5U3qb- FQ'p;JDڤ td."ΨAb&Ybm4$?y2ZnգvGycP)EF WCQesWwϢb!ʉ)jƚvab,o3,4MEը Rh›WT^eVGae4+QI6 "Q}A׌I'y*0 \n F`ⰱAsji=_Sc|(kZetC۳DP]\!tB$4z`2o6:7  m#MAہfqqպ[CYkpjhPAq=).z~3|7A |xS!9njlz(A#dskV:(R(]&d*DPT*|Dz;l҃bF7nkŰn$!>NW؊"D W}\N.BNa|;?$ ^q[0jaBlO(heQBaT(:#>j z Ai,:27iâHY>zV19E@KkLa֞@jXf=_85B'e<(1FZ2)I1e y~9(Ws΋6B{*RsêD88F6APX?˙P#,+QZiP!Vў iDF:|G |s V;ͨ>rI"VŪR盉`ebBv"Jɶ@AH"b á %;)r -0ɷmJM(uiU~՞Hx;mf4< ՠz?,7.#(Ϣ~ %nhFTwaGk*-~6B-LU9)bY ,cz|3ZzU]߀@=u.rRnߩh x'lߝzNyYZfE4ܪQy2)WF ;= *d+X{G>K) "+"+"+"+"+"+"+"+"+"+"+"+)W0\\ pe`\+/$+"+"+"+"+"+"+"+"+"+"+"+z\ )SO j%y:Wp,p"+"+"+"+"+"+"+"+"+"+"+",Gp 6W\+ pլs`a?$p%+"+"+"+"+"+"+"+"+"+"+"+"jT& ~'`ūO]uG1cn~No@Vy3O4/|} pY8 ֽ5ٻ2bzX즶:J?wF/ujCE1M+v5QmNv;LW8.5p\,JhŞn.Xwu_>VX׾/pw_  S +*R ީEXoN;ɠPm- |mCC4Qwϕn$ً_oվ[>׳z5>ڷnwa{_?o['tE=m]+{?peBl Q k9$]"xeH2n鷖@$|ʵi`Aܦxnogf]olNv_zDE+_tI{}%ķla;OjDǮggs}enkkr4u(s<?^mv#Dz3m.3dbT^il}O'>斎i6ѐoٺU٬m< BOqtx{.ƱUomN^뀿hUt` 2YF:`8'/nLotiSUr> uY[)H M0E%RJ:{_G 
c#̹;0,bNGfv5|ۖ.ޭCMRӭuGLi[.jnzZsYOaZhq= Qruk?לUR?ʭCΫ.kUv>/i޹p}>=SbOS7u0Worsoڢx>=[yr.ͱX?)o-2:(t!nTˉiOʉo"[@y<^g|VyVۮ\Yf6_n-lGA#:6z=!:}W݂^Q݀kldUei|WerU{g[9Itvu/,;6 CZØoCi8ߛ9*^M 颬'[gh#U@zԝ<4lˍq_0ėeӉ>_\w/l}69}b.;Qmznmp@Cr+>f}Wbr>xc[ϗ0?#ͱqu*-  ONjp՝!3Z##; >w%7mP jF;b frnqprCm8G_.rțҾ:Pۑ2%NYe'N7z\o8+wMꏙ׏lS[:fM -sGG:[hcJwp G6)-8_L|uKG}-ݮn]m.2zˉvs &>wƝ?mF1MS:ܮvg{wtD}FXu`*pXY]YoI+zCJyDa`z= L EԢd'xIb.mV8h- g! l V;<v(OqBS!CZY4fr7WT8PU yU ۻءK3ULM!m5hsdL* 6du׎𬍶2A5cj q[ԠnqMg N.ɴwT[YϺc6}]C.܇/^gjk fG«*⥤?-2UgB rbL)x4 }*3|IQn@zVғrm$)(RgzCk[K݈°I&Fr;+ S', :t''7t_ՍH#ܹNONֿMIA}?'&&[ud&E̪=f n[yr^8=f׍K䶤6{o[pE;W[퀴48#13]nƥd5I̒≡)%7f׋,dUzVݣGQS>_ڢJHu$He0p0H{d -Π01Yγ`V 9``:{3г'Rsi :xЈTH\rS9Vc)V(\$c4 UsƃA#Ogs{GR%yƢ^sqj5Iɾ9utQaGAҒVj:::rQ;AiR\*(D,)VΞSt(4(4JGBB M-jEfE016*hRlGP HH NܗCB*}TsEl_0 :JE蜋E`媴Nh6!Ih& xXpj#M)CY2p]ؽ"l<Dg$o1/ntyv[V pӠ&b-tgVP/P'%775`< lN {=_`ocR޷c S7$R!^]CJZqB- ur %AEFR :,I^99o.`Vef\`DŽ$A F09+N' v.H<2p8x/^{ѽzm_{edV|Q4ŴJ2Ɣ`uPElj!` Es w.Y "K`[77!*V<)q;R:FK-%!Ol:; KiR<3dDe5p9(p!M\c^IZ֞oI,KB+e'H#wd=Ѧx6ͣN>L_)GxT1 BimqOm/B* (&K[[gGu:BXQr% (brOǎjw`c7bVylfݩɕwE]RT"r&|*5,хh‚49DbJ @G!) ^H5W6L#kuzKU.zL: S1Ypn`ƌOy0٤LjO\, kt.g^6wK,[Hw%7'a ~=J%nOmzMqnG4rb'ڪSMY%_ImR澟zdT]#] +ϥr ȭ˛ϴ!n۵ɓ0/w>d]Ϭ*C>'W$]aoܕEgк݌qhȵ f$\ut9|}?O4dv{u{cv7n3 Ze&e[w{ !Rr{yߞrsyW|Ho . #3qUb1e'h.VM~9^ln9.Z[*=4bFݲVˁvNc=/xrC?ױk4%UN'L˙+9FF=PC4F8:'roݏoӏ?Ͽ}'{߾Hv&ǚƹm V>{ x3&݇v4bh`mU7|q钷IΠťnHO~Ihk9ϋݩG=lӚhI:I`ur=Gk5+$^N8fV'hrv; wn'EϋDa`WhS1rA8.%"s }H s줗v8,DK]l=! 
'I+^^ tR"8ْII&$HY VbM\\ɹN:C4:C!+hӝ[w-lUCqBwN&tg:>vQO1cDLf*2LK , A/ "Gt@V$2j!T jPgVhbI*xYgvӥ E%GBKl,5}ޮ־sxX~ ʖilz_:0̷m`~zGz@}0R6~>^#(P4,#=dXP3#: 9OTu`0Һ`hx<(LHNTJo{z),!'S O"v$ L*-h rI5vMq~ݻf@X]5[\~tnpc~ M/mAkorČx) "Eiej^59QSA#r6\vW8Y.",Z=4IOS$HGzēqϋWV*i ", 岀D鲵ȕ:cBVܣjgͦ#<J{S4M<;IX@񋨜rzg\z+*6h _qP@gM(\PR=Ks0rtH D< FR;oƃ4IhD̘qt( i$IHRN&Ns[VAp4*&ڌ&b!*Rs\ZF4RiR 2QYu\ 0Ej"!M#Bsey0i_|1-w`oj 8OyёdYlƢs",'x,λ1c XhF!rl{tNb~|V2D6[(HMlRIwDEMؚ2?} } %Y)|Lax9F / / /I>EJ.D%n>{iBJ pKjdۚ;TJDA[,ba ݹ@s,Hk,پCMq}9K /-P٬s$#l}.ntli} h^in@?09o%ْdeUb] ¨ e"Ab>P:PåYrN:>4Cb1J&i01s[ p\D%`)Mfp2Q901t>2v.NY0%w൧ U!R: m`1j-j JD<*''%i*jDJt QQ$) #w$2dɔ)E/9#?u(a)g7خ[͓~y!]8Go799_{E{Пݓb{E9*|h@ R3Ϥ*R I!y/hʲ,(RF@DKƠSVŀ:,6*m2*gJ͌cm!/l i'{fdZ8@ؿ~o|? ƣv-TJH Q)Zr,A"qHL"֣Đt%!=(aefI%wEM31 qDnd1t݈Rn: k\vqՖV"؝N8P}k(JGBVN%%x<8% ŲIHeCđ!#2,ڳGC$^RM^"pD_X#ʖ+&~{ؑUEl&֮8"QUX-\YQD$Uަ`=PH BΣv\8dA0`Y,R8r(8@@S3.8yQbQ{LrŬs8T<ŶY*9:͒#vQWXN~bƹSkS n I`5Rh/ Sų)5;~đ<=v5?ィ;=0. Fяa'~WҩaTFPLccM^qd&hM9r80GN'(Yʇ9:I$Jj Z&IHgӃr. 8A#a(|BO_vcf\f \frh_.O:z(Sx Jxvgwng39#|݀xfg{>g+\=fVGw?!s \g`t'ïjSsuSj"ds^Vˡv#M({*7#n.R<7v=fxQ}LՑ;P瑻GiB}Hwe+w *YRdhx%tN]A0ASF33qlaڻzWtA}+ofOofn{B0އ/AM99hF77M'm`ݡoǙm/{:W<qU3M gd;XL+m] a;GĖy7tȄϟx86xz/K<~'?"dՃS=8ՃS=8ՃS=8ՃS=8ՃS=8ՃS=8Ճ=8g0Ը+s̱׸_k~ۯq5n׸fܜq)>q{F<ʊyjպᜈEm'kZK*kUW B$ UHp!=e#7b&\j(%"BKfZ5'm,$?ygLDƇ ݘi-\c>(X\lEQ@'ڽ3߽ g{g{g{a(F$cDDl\PQu:7a!!^;ETp|]tF lXd69{ⵡLd )|#o8) f+s@Zɋ!(lACe9k\v="kdn_j×=MljMfۃk.f JE܃440D>h~HoAhBUe I%˔nQ:'|}d~G'gyBf:oըKWYyF}ybӠ(-LpHJi+P$B 6NN 妊_-I,Bi\ C5 o`DT%S9588QOC=0MJ{/Y*a@ĤH@1<_JaQL@9@ Ʉ pŭbHW$@:ùVDT >D?lz=y8+x'f8]l]'hz 2} E@ՙu5B'% ~2<ؕ򍡋@,|f6a0pim>+L8twzoo.g]I|/,b~Ȼ }'"\{ߺ8ខTY?MBWAQ+x0l}| R1 7/og?~0˶ p]+ȅl<Ic&rgA(0Ml`ꗵFD"x/+O7xVz{'G7uE7=*5~FNJj<>t%g҄v2')"<Kҹn?c}6'Z9hx{9Ju8CNct " QR#I+J"#pB$!tRCB˸3/VІYM?9}&h2zHzf.E"BY:քIq@X:$)Msa^P^^gOæۛZ^\'w&:7Ϟd&] p{q۪w\QIOI5Gr5'|أA"Ͱ687Npх Nd 'J6g.=ڜQ-sx&ZIb9d<7JŐYE!gCQk&^ 2Ep7t4h[e"X\!ZZn Y* qsH(~gωfgӉ~x79i(սJ_~Uj7h0}V 7«FKOVT;pxڈUSjoP_\DƘk516&3l EՊ`-_qGaCҳlH:KUkש+؜u]n2͇wa?v:`ꯔڿ&BԼ;k~#f>l?|l(B7zLz:R.~~@x1 
ΐ7NGǨm$S4"TG_/}޿zJBDtYq7Ax鸵8XP܃tD#Rh1Ɂ5iH`*8?dg=|ځf-]befrU4*#Cݵ `42ut%P#`5㩌.׽` zIhcpjo߼)ooxЬڪN]F#0TéAnTjaF2Ӥ ~*w|! inQ[wr?8Y?;=Mj"cuYBXbgZǘk!*eQ&:-;g>Nv@r%4 0`JaވH|j/^l٠p%d"c\CÙBK*,O\HNΥ:sgZ8B7r? .F&BSZr[͗eEelqUuuUsQKd XO='eoʣHLFSTLigWQWc=ծjS.Ϻ*P^(C!dTż%A #ȹϰצ'땣܆+%Sq΋MwbwyWp엛05ѻÝK1:GgW$=GkU"zZZ j.WpQۜʮĂUS,_u`ȳ˥@kV[}K68q32 b~PkQzz߽Gt^y\3T+P"QxK<ޫP*!{Kf]tB, @X0C.$Iݻe4M4ݫ!簤Z*#sK(k\;T\MlsHL%\C S&I%[T*Y*g b6!b3hewӫ݆Y=HCޙק-q䏞l1qr9o xUW[Rr]yfOknt{x/緡ݕu̅Eʵ=[1&lu`dMhg:3^F+SBBo,.M;[D 93vj,ΔtJ"/z3$'2/1u:_{hx׉a? 5?b DK\.`C\([TT*oAL^O)N׊dP,^\Pl"9ł/>NSDC9eݸe_zt㉣z쨞=V;*ݝ/T>sm-X5$ܙSI9?Djsc\J))-f ų`sY%ER1#ijצND09*d W:,Yp df Y#vBЙ0PZglJZ`H6(} 5[RB !%m H7r~i),*ъ3U`EJIY +fMjťTvć<,&>f=МvOlT!cUE&+вG빰1PGYdj_A-#A\q0صO Lq}u[3c`s4G+{y{6Xeu\A!LvҐb2b|V h, b.vaQ- n7F19`-{@̍T]uN1[(v.!# \6*$A0>Prziȹ\BT8dSub|Ʈaͭtן\]Eǫ3kt@Ǫgec}lrX@?*udpB.j jF89n[qSt=02S,JĒs1VHu!%S e#>IcV8̦cJ6j"kEGNl$<&;&ݍ{ƤoU'{*?QEk(-e4EpSʀ*"9+UP+($ӎi/`N( +dg*Uȝ9w j}Xxt;GN%$ݍO IJ6 bYx>2r`#JT\=o=KW:/]Oj;gk(럊U+߽v "R&fSj%byqrNBfR%ЇvEQ=U28,LɔXK`LڨȢzknܭang .4.|T]muِFF wgJ,-j<77 q6;y6]>sUi /̱JT(X,8⃺ѧdS݈`54g/lJMkڴ䡠gBVc6@s2޻ȹ[cybjmYkQk a3"A8 `E%/`QgQ00j+uՇ<ƜxȂ 5m38VF|®,Nu;nܭN5SSDZ;kD5o=xUXuj6EZǬEUKJ) \V B цV#o\XMBnA 7=J^!g !{gv#,uuJn\r^t^ ^}#clbZ8: "{S"D?R.8.עtV"z1cawdTX{(rfm#UOǕ(# n~|GC~W%!ovyyb=:Ė8" 8 zWdlc"G{>y"$F"`8n'XvFҺz"XVr2VZ7hV=^,?ӴUsvW;i#ǚJ10a³D4^.ѩjW(,!5Dbl)\( VE bq"HxȈ7,[t̽u#n!ޣm9- ILC!tG!ۛ"~ϭ>au/agٔ*2O: R( .^B,@uN;W\0 >{jBu8zG[bڰJ5sDNX[s(D]s |VJ5$RASMDVͽysC:CO[>!b_mtk(4<Q]!ց|khO1)zZj+prRvDv]$ͺ;'XYDRA^{Dqpu-G81{Jd2vGOJ #WFe PIY*pZvnSN<-*OkX=gT >/ZD@ lCt )KOYQ$XQzυz*ɕDK`CA %CbR*SitA~'}"C숨R(P"RAYBq)HM)lT8*.ZuϤapS Bd:1`pY1B eB&eEQEDִBL}n,r·,ϙ]v*=ebE摂QAS1JX@;!Ɯ`~c'U+t?br kIa}˰;O]KHM(+CD_bbՏ?}>YyI0$Ur^&KX3$N*[]i|xf`> vi|5ɯkiB&|8*^1b椭2:5i{{izq!mzmmvdE8mwߕ|Nbm&|{盾?_o˥X|gu/7\&_r[ńE&ru? 
lH;{R/oEMQ־oOҐqCi<0[ [伹5t%03~ؼHy ,gu7ma\ W'A!;u^mn 4@$?P,YbJZ.8Ru4.`* )&3(g6im[b\)1~T1=( 3_ =!|J-2$O9dFlP{׶HdEاyi<-f;O;`F$$`bIVJl%Ōd2OqBJ!-ZQq j'rK(b#dέ\M|nzTY\=HG|Xsrx}{I?1VIp^>GrE}곊9zR5pânQCE`S :syѻ9Gsd?* LA$BJ $ๅ^x{,Cshf2B 1zT%hEebXM3+>ߐvb_!$ف̚YڅeMڒƘՙa"T,D4SbfƯ̆ i5:Ik^~(OYER+$Y!Z\9cGvֲxğ2lyraYU`PV{(}IB ) y',U6"fL2IIϐVs p)4"L]d$CbIh%=ҕ@( QtRε H Usqz3fƥX*cXS1|Xqkb[!)<-_6hx<}9bk1q s`SP `YĒ"$+lP ɚe{$*bLXP`SFxI1 SF}QKӈ21fǥm*P{`G7& w4YEdE%`'sA2 EmlHh" !WdE'Id ڔ G( D@ ,QmmWMx뒭{kWk~\2"D<{ V6u-sҲ 9GdE@uH6ZII.*g"T6p+K,i T1UFjFħL:.Λri,5n1E00S';QzgDTҀޜԚa}~ l#|?kY˽?ml։`ҍux< n~|'GPս'V*KAtL>LrckRDUp4 2M4 m3!e) m:ZRRRa$Zp8/,Oʤ;O΢W6[Շ8%B(ViT%wY$JFE40#|ef]M3!g1a6K"KZo2a X72<@he ȸe.ȭfY`,b!" B:U [&X ҂r%2X+&E HGH5aI T;̭8xúƏ&bu߁2xD|NJFyVV`w6Kkg ;ٮJk'>h+XRl-F(cU2Z1F1$()7ez,-hX2GVQM , Y'RtyW?DGD?1?}cBW쉮W];ozhi_C(Dxץ׏nA? NAj>i,[s2?̾GGZ6rr~džEύOw^Pbg|MG[_=>ϙg5j>W{#6Ssf0mֻuq7FX p8Ulv.h-h b Ȟ%]I r'URB'i%p*_l-'+!^>v*4=4B4>Aw6gmgBz|#e&A VcU8Ǩ#g0͓tAiʦHW ?y[{=DpJTN1-FĔ:DXؼj<;bcW͵oVxئXv`Yeiշ,[(&/f_nwBs; &d2rMtEq9BbsVHJ& i{gB&!Ʉ%rUrhA h+P V*!JRp|D[ 0g Ɨb`$%i>Q py n]>n8B9-W։ߒ~tq5h=Ncw L^wHW߉ZךּZi%Ԟ)JRy^ѨRFoP}ҁ^Zauy'Bmұ9n8it0i󭟻u،ajM[ԃ:E]FI 1物3eV -E0J x,Bt-v񶮤 ; 2ll?wy)q -<*uF<-]YhE&;a46 r5/N>NO E?DZN\9 2+[Hl׳H-{PN :*Q0ivB$Wމiyx KbTRb0jBؠȢQAVMrNPA ]˹'Ͽ]UAZ5Ъjo GY"g>h41eL^)ge6T+E`\힃A/村z|Z&nwdMB \0\  ,8Z=h}=Xҋ(}ԢCg5C!Cz)mKe/d%Tċ.|0L`exY:gb!r6f@l㯇p7yO1"BU Yv<|I3ɷo> Ngl~[:ϋ ξFY,v48;+G_L?JJ7)liT-ѬHu&0. o.X?owofkZ?厎Y?>8{ W\^MI/&VŶpC~zݛ=x;rYvسd~oyWy2NՅv(,FH ۛMTHLv9T3;"cU>8d/"~q$|SykBޏ/}9nN&kU"gUoqp˺XmIh'{)˴N10s"xwWaetnry{eV')EU+_.1@gUU^Fr:QJ0jSJƻPU26x|J4_]mnlr37w^D&}zhsrhe^{kJ%H~#N1Zz[?Fzݛ?Fe,r4JjC#2BGr^/`NWs|W K eg"es#z"bPxe]uY6. iBS.c ,͠LLR;b|6ؘTN|R1࢔:D,-nLLsfHM{cop++ISCKJH$5*/u>] -ڲo/Nw6;^NMyMF2R=PΙCɣƼVffIC[WEsBh)ouI9QU|Ct45Yҗͮ踢ywsK)B6R B^Z}l'T%ޓdWiLS0,NvL'tvuȒ[II"ZDٲpbջz|ovOAX} k;SČJMzɺ|,6p!pa:\# LGtG+HDs. #pѻ4~h. 1ȥўS6jo8: RǂM/X. ZNH*&$J 4b#tt•p:W!ٹ癆. 
^H}v5LI K210**MY)(60R\"ȟ_*[GSҼ*X+ Nj`v.Qbzf2 vpݛLA ^@յٿJ`pVtb P!jv5 >T}0pXcg&/Eɧp]BTX>Wt1w( $V:uVw7j@$>m~[q&$kY|݀;+oO鵫³yYSMm-!f.0KRm-A6|}5-ڜ0|jZIƕ_E2y3[DBG% 3DOk/[S7ʟo[udӪ JĐU')f2RThR 诘L٨X[*'ޕ˙_;鷟~xq^pه/``\մIv¯m@oVaT6K.b]) ZB#mmk!@^_}}6gũG:lYXrzH f3?JV֬r:p#J]IRS%*D#Q18lN4/of8e8ɽ'uaBz8ہO"3 qkD!V!%=e k#Ns8xɹAu[. j趧;^3?Tn,Y:p;S#*S@ݗL Da<2bB4u%Jv,yErK  e7 H.t|w7ͻ˰H{_Uqu;tl .i:dۣ͕gswݨjB>pg*(Fa )b dccCP5j&ZpN{{ M`4rw/ke~msË;c|s? Gt\\ F4%+ Km^B1X9_K#F,B0#AFV!V֔109 XyZSp3ƧMI ToFNgKݬO»{|z~m@LBRkMH!; UTsc%b49NxQ >^I8;8E443{#YC; a^,0MTڊA:uYHRXlBxB!' MgxVW~)f>CV-Yyߙ^NaͫL\#t'% zt5ՙl??T,OU/U9[Wl{3&f\0F/{J*Quh V.Z f[}q۟SvJ9L0˧A .MWn`^,#BTŇe-#K17)Il_3O!"F ?LwǃI^=zYGH(򹖜N)bxЄ]Iep/ȽpoOۤv#GrW0sR+R3Ey:3;˜܉#e6u3*)'_2A Ag6C6F FCqn7Ay1ϕ݌{%oƦ Mc3+\@>RP#ȩѶǩLj.ik CmWO[C9^,CZg&R&4'@!fX?W{_B^F|dح'U1$~d41NZEYvJTؖ ,xY/M,g_O~_gM~e/Jɿ߂?}Jjۚov\[1m+&lNo6>V1lZ+'Lނpzq3TSz1)eПLث`՛@oXMQx\JN5lfB Q">1$߄ 7l5Hك 7"'a)R !l_^& /.{VƛG⭺ئH|u 8uKo/ut)b\67DۥF6kJ;G#ڹq{ڵq5VӮf- MTh_T],ׅl!k~ŷO;,LJ9tZ4-Kl&sl'k,I̲R;U 5]\npw r"dvw^yi&obT1:\';me_~3zޟ|e ߸ݶ+Wwmnb/KUk@Ѣ̇J2/h.3_EӲɏU[dh Ϊ/Om0EMMGfW{s8k im¿R!A2(W|^R4O\j}nէޛ)Nu u,6bdjM[}y/BA>RP-<6pg.Ks2VNZ fJSUKnpCzr7ڢd|VQjUOJa[vod"r.oGMZ0v\KI|!+=qZ5dEcL/T߾rRU X( ǞX,f";P ep |rGzZm5Ԉ-=MzI[O^qu]f;4kT$pJTT0+,q0O[V\Z_y+52J (QG4u$\k=q4 #x̢D^@;9$h5o^20".w+C0!UXJzИILd4zl5ͭjMmȩj5R;o 2r1̓lsjA2f& ֢Ƈn ^pf c7pC#1z 1Yc^[XiF6z92 T[K]QR2I2B@lmRA m@^[ U( `! ?A*J!r-:55lK}aƟr`\ 2RR N` Z[r ZR[Åi'"OtL n7T9,Q 3!J198=PEB[CvJ `ԇNy tcM̨3ʀYO5! 
ACiBKN]n @IIPyd:.x C^5q OX`ک]t9(D@hBD*vCAcZi.B"p]b# !.v[ d$K W\+M68rLZ-oU9U8f /8uTT,iX$+#8=a0xHSR4e"I\:H'E $yV̀j@o]Q?B2.@Ơ) jbȇ4VH,@:Z$R_7UUN%nǥ*"xk"E1 Ü/H J%[`LREٔ|+p`Rp 3XK a+XEer L#+%po]MtF%JE5XUάbKQ@+ho TER+ؕ(2כlh $!)byIلv_#*DgVUQc%Gi"j8(C Pe0>j2a+֢ƛ ~\uXgLӄ~ #n.eg,F%̷<k1 E%eA; B u@0- |v&/tvM7?nTirB#5u ZǁmZ$,>&b‘J{a&2]Wҥ`ruA"X L͇ɠ<Πo _jPcA#zPr (1de+ JPaZ*562Ab)ЭLmQx nEe06ज़"Yԏ/X?ON_EiVp6SRe%)8 dxz v<ܮyqP'SaA[w5`QDD_b =Nx@6DP1is*@QR"d2{:&oCUQXÎn: L>\gjy=(f@cb- FyXN ВH۔YcXT4I8KmF|d)e$bzc"Õ, Ƃy7( s.}p^8,*VJXG'R% fcI; 4]n0AgY ̂5*)8RY xAmJUΪrݎ2+{уEXY{ oP[ 3$]$y͸t*2 c`Hu+m0ZXi̭wwv9Nɸ]'63 DPS\tBÑ$0z`2o6:7  iځQf.֐D;k sQS9%  `L95G _𙻶D,T@s1%%\D[9hPʥP4]&T*d@y!UX@%IO p'r5c5X߰UaݲIY!|?uEr\O.^"Nlg,+v*vB!DE HbQ<T5uClep?{ ڰ(RVXGϊSZ]FʄȍL#vz@pkOLA I d:USEce Ϭu~=(Ws΋`jho:k/5s%f Yk0m@p A,gfFZV2Zaԋ@+hB;?H(r|d8rOYS:(0qֆsgH`~="i0oi3φbUnf" RLYY 4 #!2.P,9tQ-0 UZX~NW$<Neg,2B Z<v|ߋp(B17>pAAO;_~9~FzTc8#ű Ͳ=F|\b|<%0Vjt:/l>=Zh+i?'̖廣`A2k~/ry89>1O<@qZ[|Er8ܛr9Fވt2z?9|k,o/&IQr&L'c7r0^پ*f9vc~DµD@NzJh@mT=x%Ԋ@_" R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)^%WI Q.ʇ;` @3o)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@_q' {DJ @0Y(?t%PR9R}J $"I DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@z@`iHrǣ=%U+`@ZH DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@z@#|ފ'Ziu^_7 qRbw?Jl"s$ H^XZN>:hf9vu1o>3&NYef](fovRy^ 8޾Jz=^Wunj;%^N6.If>hzrq6ӿm)8ئdekWB*V㸊"J˞;/=G׎s(N7S'-Wr7ێXrn'[Y[ۄn9`>Z U[Y ~Tc\9^[Qgp؀ ~ޤFxcWL 2ijH$xHq0 'Zwuz^OXvj:*0kUm;yͳm(,S})pe 2׎wCl^Zs iO+LW?߳j6C޿Gķc~ ˳uޯ͒I`״x啙ؾ tyrwE |{~.W=raEo9MM" C &wykf4K:l0|$t7ݍ k$\) ,6s7Fo'DhݽX'0W;=Z`n_w~xZm^xvή159o'::[ϭ= ~\-=; N[յ\{'zw:ﻍ迍W9t\f.Ga,J>Mm'z|ap޻ v]kX-S̚]%.6F֭/G/]Z1?6m /MfGirqwxqݳ]?ӧ߽8>g/~x$> |pyoPNvQo&ϯ>#uvv )_~\}:,/{6?McQkhWuviFa#_d1JKtV}UVt!> =Ѭ;:sc8s1nHWĝxiY/qG\#wv?uo}Zl?3N{X AABmϬ1p,&f>XbZ3\gsEY.]?u{׽|^NbW#߾8wvӝud^ZyxJ>t׌MʭwMﭛ6Ƶq?xs;4Q/<;Mq5T2 &CE4 _*jy M/'  v cG"5"eYՏ(D k6WuިRS9I@Q6 4(u6+ȆGf\į~<{9l_(m4=d02ڊh/u)R! 
ImGY5de>¿*Ɵ[2BD`WCqU#Cs}';ֹgqIVTw=&o~mWvhN'm Z_ӖΟ-1=۹ow[ʾt||\:3%7/+ j-i6v?k͌>RRB%ME{BW:$a3Rh Ks@^h[}Um9W$T|pף .HԎ65? :7\8O -}G B+]>@DlWGζ4v;P â.7 -w}1KNr_[e}CT)R[y{]Bӻ d8d q졍=h,÷GYrVo*hfyGfѳ⍂}g ]X Vv~|5:c|)Ф:7;4wASZZfv?^h?l|vQ A, 3d>"к"9KJ|0 dIQK d)mA%7NNA3?+ g,GL]4N DRK肞NWG[b&)Gt;~rbk.L1L4ȵKJ0:d8'QZ$LjꭶT%?7=@MTPo!%#őx9A撂t) C AHHg`Tjqh'oʆs]˜u9f̒KH x00IGV$b&[%=3GQEpTEj4T] V%@e+RDeI @ZpH- 0mbm<"RӃ U#]WfYuiע-b "*,"Mg Èe-*#} pdz1cm^ wT]CT#:AGEF? e&MDCnH;=TD $C)paof-gN/%Jt> ^ a((ZalD2td1c15j S@'I>ukl2K GExBT r հQJZfQ5wJs!Q:p &2p!k"'pmY`WmF]m9C)tH;'WoGD2nt]7.sRg ;˕<Ͼ'Z&ڗ{Okqt5ЏSkTD6 $6~/ Ìnt.-%xǵ>gc")g֙Y)$$wq|rc(s:w%/G ی zDb/!EE.%L2'sNz.qalVdTܵo}0d>x8Em*z$DeqC`912^Qq9&.Vk˫De!!ɒdL1ʌZm9[x?\u(`)p>]˔WQ@kn^+$Z<(>SNg.3p4#սU>JcbJYaQ&9(ueNi2b*z]W2%@ =H|iK" "yL6g+TmXm9%c=R/|e* ,GYWY8_QuA+B 'OYG8Vod@M|4%6Fi)yL2/dXL2V@eI6Jd{aeچb$ H R!hHFrf$YYCp:G,[4r6K0Pv5UjR[u‰LuTGg3@`N `EI<G%CΎJ6>A@2!CdEGq"!Zl#(zÇUbj4v]lQ?\բ]%,Q"%I*0{Ty֦SXι\H0%: dDuHid!"9RPF,iFz!N%blQ Qv\F%;E+E{Gֻ !l|){Z$ޑ(ʪ.DI$ģ\܇\=z?րKFkغj4Ie 64Z$wY B c#RrߘgΊSJp8/,$h -ӥׂ8 3/Fgz|Dˏ秤Oˆl@Y-72=?y]JB sY2ㆥ(p3dlI .IH uR~nsտ.)yI7w <聲c`˯0X,ẝr]v.tm'޿+ޯG#9δфg!Q)gq n;dk Nk1h(!\,G& ːư,HgWʸ#"*WEeȌ4bI{ xLTGH(D,h%UQj<@`Lbʃ%QR"z *&|Hf9BVQ/?NJm]J}l_7S%ȋr?:kZ|MjRh؜ܜvh~iAſ?~0ft97oΤ2_}yM6>e+ׯO^arِD0D99hJy;##=~Wrtx^LURH\iR|T8"1j @Kз%0!cA$CiIZGR͢fF[΁g&AΥ*K/dxu8ܾ>1iӄ9KHTnz^~".m4/@he֏vԣeo.O[NFs00|NkwF4rKoH>yazz7zmh~"\s:Nj?ٻ綍d)yHsRne:orIsM Ix[ . ;Tmw WXz&S9U*^{E4(YFYTpG%2EA~QS9>VDvټʔ{?5B0%g\u݆O|gQx1a>t0 +@ ëOWSG0W)׊DgI 5,U)6gF\oD+I;`RY⇔J8u;~sf٧}v3*Je)H˧gC^xfH!tjΨAVbk U cJ#68R$#?g)Oc.*5X@U-b#arWˆ):ކl8Nk>l VY4 r5%o~}?j""Xv:2shyK䶔5VWXhKpUvH뉐F'@ZU Ƙc|:@vy$"kY"M:@W)Ka :V Q[8jFᨚTO8ZA+˜R; ,E% &bR9c6([8Sv`g.Sb&~wdTvέ#+:%DO;LIcS% ez 8EhO[g 6cBwEPf' gxm)|_ }Xyfg%r0u.qf,;\F係Y"Ѻ fkX{l0g#ٛx2IF[}`&ƅ46u-AxuBD!Lߴ'!0 Rt^}$($ԙnfύ{f̊*ھ@*4|jRM;TSٻS[1t`f9arPH\)E6O# [\29gDvN~qG=9 ~{>qARt^&~&z7MO@UeiTo Õu#}]%}~3lxhEԚ{d*s]S.ou!ӅY>%T5<2IUEŚjTAˠo6y7&,3+Ť Ұ{N,k&~"=n{6-V8_޽4);MBk@$d0 ,C{Rsed8Bog }vbBjC?9~$`4?`,ED\"w*"$R0vQT/P|"ZH*up ńuJ$~> Ɂگtw5F 77nE&]O܆7P~2. 
e|<&GiV>0\]Aộ{5ffrZ83J ^# ddGr%uG q6EGe奼e,:uDqtl.] -^~n'dv*<.}.Lg T8M|> S*bfWJ֒+תg)CQQ8+8{g%;!sdΈy ˣ`=%vұoeS<`s-:`zkkK,ܤr!X0Wi^KA| Kv#z$btCEseSY;7zeޖmg)Vj!zM>qIzj " |K}E/+'UL_{r;せݤftS‡(;_@a_odՙ;:C=,;$}҆~ H`t.fɩ-p.[3qVq myr_p Gz\ %[a_ ]0:'&f:KnRW+Od\n/[rAX8D$$M&W!BҤ=to>_k;0pŒEt/?˷yPp`r}"Xhqj"]JYd^DbˉW3lBz*+|(w[)Dee.K]1f~|{lerbh{a.2*}yW}Iqqm V|Wvқ~^<|(Hil|hlͬxOYL? ]O\0gLdB+=·)i7U!LrZ赦όQh+Ap O 0g)nH~)cNr/9g\~5Zh!EG~RۼVi]T2.? LU~*?25FS.'niZoTY~DPZX[|W c}ei5ߙcaG;\^nX/ABqFGkMvHFh5;D'-'"$0w ->=ód^ )=ہh J HZk܊&괭h 6W0Q,&rwS :`GR&|kZ'>MGDWY-̌#T,EQ)ѾҲLy( DQyTqhd|Y $3B0q8MHTb_6_n0 T8%& -MˑNbF8@4B}oYWEn/kk(\R+BjjN[J{zʣ.D-F+ O )AQO6k\+Qe8Iv9r=X\ZfkmT*%djvȕj$w&hm m(Xp\^뗤gQ3j46H#H'"`*P\r)+3&),ID:# r Vl|Wp&vF0OU6ީH"0\<fg9s~Tm)Xd11,Fr; u  K^ͳ94ۯcJu ^"Tޡ3k=^n{A2QiNF s<ޏɪg]5kvնûl2Ò">1zxYW}+n˭^*&ƁK5n*j0::/Si8\Qq]L\9'|^i"3I ȁ _f]7/J݄ ?$%ڂILKw+%e[5ejFo[j/? /Em<(0[J73*>ur/N nmt"}{@4#xx/Z/YLexrqBf&eͱEdYya󸢳WnA(*Ӂsk0Ǫ#/?wz"7yGe|0dz[e:~RqgUUvpоiTT$1 u`3rX<1zU: &_L_8 FbƔ~6s^&5f3&nBp]2u3m=#sfjO]Is+7+ /c1n/ H"DQ(VQh?QlH|L<1`}Ϣ$=IAe;H]8Hx!e~-"%T * 3i]7Pl0Jlߊjp;I^˂tfb4Q e#9e0NA0No! pL5 !z0Fl9,./7K7"H x׶Gy?RL뫽diyXxݯf򾈘MM+tQc|!\ksGXwV"nl<-PdN< 4$8Ki,O5|]bZKM$RR KxF!!Efq,A0L)5BF<=&ddrB44KVe91;WCx%E88ͭ<&3)W).&cUO0B ZĜ[%֊+kPpHxvZ=8gAf /M(.97)]|\R >jZX do,Z[Qya 89Dd0z~,aÙG3:zM'BW]M )t51Ru) RrP&]>MNSBG;Dά/&jCGE ;!ppSL4^ r\K.aԓ ZUO0D:m8>OW #xY>GC;o:ǧF()2]̀ L(jNfb rÔF(RU"4KPBp̏'d)WB#JH~(Ӧa}=j+8AXc3"8L@TF8P;`tqTCXKTxl+^BN>j$Sa"L5a J XBZP)i_),jZQԽ"Xڽ!|Nک'Pr? #[ BWŨ{t6àᖯhz6dGH(1S;%.1@oeh|.ug$g_e5}od͊K҂21t;)ad:/K.}o02-9'IhHeh6T~sU& aJ~r o~# c273hz,|d `2e<㐆I)Xp?_3&-gk)z"Lo@pcJc%{ \s: 3N9G" "+$>^ _^Vc3A,$| H! 
K}4i@uQgyպ #8꒗x CrapucԒ@PJP-&7T6g'(S%K*`Kj6`z_6xdd2#rd!q}d1ZҚ-mLlMxi;&[\+KV20/\*&?sPJckL/RCAD8ꯥQdۮ/4^+pO^C%p%lgK2* ͸E5ynn 8!ScKfh23hK޾ I %͐_)I^̬U:-"P| DL xmh$&x|e>2(~^Ф *N@iy TO.CeZ,MM؟Cqo1у$-+wP2%R3y!Q &vH;&fh2-|6f2i|dz=UGp󦔑Cqbq~)Uڄb$A gȻu{4:\DT[,M_ , P|>p8YɐS2ठ 0#['1A#|}^1a0Q8SИ ل?݀F]ɱ7=8ǂdq3gCa ap-8EV,_YKKٖ$CLd׉X?{lz 7J擼( HWh=D 6Wj%v֗hl1f"ߢ`%876FQczPK$INj#0xq|-%LvGi;X`aԎ+@ļ7%tQۥ&ok ԬY^cJ ŹݣX PFs{@pSiAT8@Mi?bC烋x qX,i-d(%fGޮϐwnrά&YL3+Ƞ-M&iy 0 I=_r.F ?VɿfI[Z~F(ۛc:z~I&_( ލf˧"MF"(N9N@" E7*V18ho~p1%KyWLaq ĥi7QQ&HCr3+&K'Glݍ@ `vQ;m={d8lhx15N#@h$#@댁 A̢,dlt#jvTf?WaI^jy?>O5#͔V )M JŒ4e1T?ho<k=2zoɣ;z)I!+gEDfQ*!YAK/U2LM&g,{6w*IIf^N |:hJN)Z4{ QD`ct H%"2f7|Z&=""9@EOH)ܷ4:}hhZ;quEh-E %Qmws=_ VzɪFCdt9KdC:%$.c ekfq)G lTɂťXa*vNpR[9}zNo=[| (O'"%R# `jRpؼu(x[}9rǠTivrۮf_J)i:N Yšhv8){uB,Nx.8-il~-6~lkw I^Kd$G BnAINo  Ujw"AV]]g2p+ }])þ(}x7(˖"MapϽBNG:#ѹץ$j䣴hW$V~7"`ó{xv}\=۶$YB(@1lt'Hpg1!XaU?#̟L??s :{m>QeƯnKN{Zsx}.x使ͧq,[dEWTmz8O=5Uu8z v˘Sa~wR)>p/v \5Za]5pn$89_7`L.pDúͫ L\μ{Oɛy v=V=veT15/Txu.yIWEZ.*7J;t{h_A]V_~<^]Z]]-^FhܭFENst2_}PhU'_fbUR"yVt`2,xup 'AqdfQl4a{N%™7=y뮮N|QӫPU&ۼp8?_ܹd|F +W/&?Fp5H̰>*! s$DBlLJl6Ϣd bHd++ѧqսf]NNYǗcJv;4ZYFݡB|Qaρ^D$ 7 lJTyLB$Ԇ{CS-z q:% ͚Cnj&Fױ,c 3+_GI{?I'z$3Mzvq8'wSq2z+\>J3,,Gk8ZXgẙfYfJ( ;: [t !P?O@" 65bf?K> VT%Bӫ"2G%XIcZBdEھ0XO:#7te}571sD}PV4Z\F=CTwL CVS;G Qgq$# I}3G Ip;<ϗAdsSG^)iSǍey_DdT$/ؒ4 IRQ"C{HMߠ~|ehv3잍3ϫw/Ds"RO)ۦAGqʭ TF$f.ۭd% t_ @ rYk5^f-CYt( &*K3`f8&,nƊVڋ[qnU 8={k@sRt>x )mp`nm*eSv#I;5f|}!iۏa.ܛu`m=$Qr4I% VT_+nR=yLm3c=ձP˅gW2`lob-[{۾]Io8+A- aыBW T xĹ?vٖl$eEP@Cϧ9B <w#0P>tءE9my(-E"Z+*gN!j\mR6 ➠\tɏd~UUGy4]L8E3R6;YSP:io[ |透~*a"IπQDTaM8Q+ ix/1ڏf qHX8a{ {+˰%"csF,+2"=G@HOr%cG&VJ0@K dގiJ jJ$?r~.5_qoqsَ . 
>"L$1rƅ#̜).׈#,dbe/5c:K?`#/N#Rٽ|$8 2Τ ݝ Q?i.%O[s{7co[v$y$.'Aaxg60s2@n9_!oHr5Grl>,nPhx0e!{͟=" jFw$e*͌ 5s|U5Ț,wL]Wj"'x3ՊUcjT@._ C/n+84`՝a߬jpH~ܗzp]dN6wxtGAN)z"Sٴ?1^H T )rFqng)i9OR Cm>4JMb 5!˒S0*e3W%Xg ).P E#%R @!@ed5"*#%Pl .ꕻA~Tfw]Oq{ǣG@ K6@wEX'Ĉ׮XHt/wnx!d B$MS0[S<SV^9zg\t/Za_8n&i{Noxu~A_F0uWĹ]}%Vzo~8bms/’S%1 eQ 8VX'2 !'@4eH+!d3_}/жL) o0Gvxw;7{tWJK_­FW`gV5<\8T2 ܆S@ y2s1 ܖ[ =&-==5sܕW* /p浴Z86QKq&WZ*q/N{Y/JǕ{>=̝iJ/١=m{OŤQ&UeRqH s[^#wa u- 4/'"eQX'QVlϠ Q:+nY*jW8$2}F94*6H" 1)Aa2>5s|FlYʩV3تfo&k`k\3G9D#7vi>k7EM$1Z]CUDJSRPߢ4 D寫ttv׺I6#{QGZnT"7Mj pc&$fGI6+Xn"7U-i(]SE 28L dx%I,`J44O@H+Fd j8..[^VZL|Vބ=q *d!eZQ<kU_3G$WEXdDNFSY3 tSI5VI7#輥G*1l`Nn3سOOşpܺѶmێѶɶsϵh@c'H*F,<$ Os9>s|c[e,FWȓ{^>[LtH~'d(j1~hgcc?KҶL5nQ@j;}Ew qE} 1آX=X3g?F`wH~ƒb% 3@zYq+PWҹiDZ9u_kѐjy0zCl~ hOuy~jy$,v{zp;Ԡ!F"ᔃhIPJTwUGօuZ޲Lk3$Jl1o}Htqќv~(kM;汴c}Kd'7SAf~!$ZL%V{ߛƦ5sܥ_44_ul{VL#o*sNxKFͪIb Sc1 #ʼb")t1F ikF[Rx_\k FQ^Y]̏- i9+j#/ \*25<UKatGUddĐ`+-QF+1@bF -:-~N㤻4Vbx{(zltrVq9w(H@F{Tii]kkXlQC9BL9B߰F VvЋ̟4w r [T͏z;M,5|POqqՍg0-f 嬤=1X1\E+a{F(J2}Bg5h{ɗdd _H?W.z.kU#CpVՁ;ʹ$܇G SP겻ęfwXxrj xݭ^>wkxx]d tHTj_A3(~xZ@ߋu4}͋'!N_ׇ@i{a1O̭mi |fk8Tӽ!FVD!5x^bGۧ$—=g߾spJ0?P і+ps4P;#~G]k>NO <hHgg R枿?`e>з%5z\6_k#(kpMz.+6'i3wGz"y9y*[L[Yd֫pUVSq(C-Ya+8+zφ +X+r΅ܽoyi UX\`xHުk< cY+AZkZoc]h4iQ.})$qyhXoo H洊9~ƛ#l5sR(tаؚ;'#Z:_|G.Q*mE֎ +q;_5*c?%0Dm4sW'"S¼N@g-eه*ȨBk0QuZ T^o^`Wak-"1v* m-\cㆺ]/IJNK/\sN+Q{8 -]#B4Zߍ*5O89{^C.*{*Ϫ)_ycaJR(0 f9w[vW,[39$—M!Ս`ONqpSق$Q(,3Z36b (;n6o|96CQMT Fs'53Wc*\%L$ 0BY9KBTIJ IAg%f͛ReIs6@b@0$qf?V})1$!WlZuw4GTRaGŀ"S 8VX'2 !'Ҧ;|1^i\,XoV 3?$ yOdmELp认LT " ;l+h3e5@' Jv mASo  jQ/KSߩ_ )j dT m~jOQGbXydQi!P)! 
(K9fRHj8oRVf {)sS|2IxjXc@@*6җHfcb1Y\.Hr AC+SdTv+cnciл _>joVJ=SVkKhE\3\ 8.;¬jGX_lsT(saf6selJqsڒz DJ~Q(6  |K#$F `HyȢSCa?'uٮԗi\IuBKtD]eHgm 6䋰wWV8쐘v@Or5TbyF4er}) qtpRB_R΢9vB@C$*GP;pXKM=A{XJ۽ a>},;)YM':sX3\pb#'y%߬x(aE0 Q a^^X}=&8~]$4bp~P߁T!}" \|4/^uT|=K(ηP&_LJ$-4vW^VlFţmhs=*.^g||iEY E /r_\>"cnkatL_5X_IA\};x,ha2/?d!ZmOe689^]¾'tpOHν=J<~|/Ӌ e0AڙvEN.fZF &˩f1o fpc;((1c q{ ȴˌ_^jqlR#?uVˣKs| KjpJdIM]\j%[xӭoކO`-wv>>`1TDo6-VT&lyՀWS>2ڿ.o]koOH"$LՁp , w4V'GU-[}lSd9՞̜z+j) ]Q^khtԛ6+2~0=7 k͎WPpV_In i#MmGSo_oRKǵl>% rZ+I0gyL JVT74s.oMީzC؆jgN1eWL;ZQٯvboLcpL՘1;SE w!8!drF ZJ￈>,n )OV O`цHv@ |?6^1 FqfIs Uwr#{ӈWv?>Oyny)gxwl 1$I"Ä"q HcMbIP-< BAc4e^sZїwj# 2aXQ`(L@B0+]G0OXThJRȔ$C0})BZ0ҺtC#a T,l Ga$ JTbdTlE]{D.Rmn 6(exgN/QQF2H (*Q!GAd;Hyu/vT"r b/"q2Լ߅gYZF/ѣ{Ki앞_gpQ,;BkX2S3J̽[Q0Eמ0! 80LȈWG„<WX֚3h3Q;_[V7Xt' d+V >isw^8MB"ǎ1ʕ~ڲ1wiS/-A'@hq?lZkB2rZ̬4{(6%{?;lN xcd ˁRE5ʤi.HBdiP܀UH)ׄ&_CcUHbQ`-EcЅޫI_.RoiqzQ' ;G^r6RfKKN2q?c!v7;Ж!-;*ha!ĖO0ڷ7ݎHג݆z ,75lxͳFUPm .ŰD]!BR(lx"RxhdQGN4Ur2Y/*sǝX/5Tc.>RK3I3+J,jt@8sˊ2?  |@S8",fx>՚9 Ւ?u}V'ʣ>{}]&9SmEMǹƓT"LˊaLau_Mhva8Wuc M1Ow19Qhէ"1:UƳQ~&<Wd@/tvTg X š-둠Az{k[5OkIiGKhu T%)S99+ c"Ll  %!|[R F+'HpA{\=e?E(u~ЂQAt{= oSE\vYqr>-PH;S^G|'DCmɍ3 eh/ ڐ5BFt7\nڤTqL5}8 Œ)ho-fq1+*b^ Njo>45)3FB".A!R;N!hw-ͱc!G5+6-r!Q+/׽J`Z1OmHnӗ PBO^ĉTTE\bV(C.E:1t:IX)zS#@ŖFeq-*uYfȱwono:LJ ("F%2BpD1f44ҖFE&O*{tRnE}J=)(:^eTk"uyJ%S QOLs!94AObabr58;5@@/ qdxO52.6"uUj=efHIjmK|~FRŜT[H) toUa1KCFaU99OCԓWi7J3q %J?F"#;s­җphɱhP.s_A,'zxa?f ? 
a3C I4FAh|/(Өh8xQ]h7 PG w>_q%N(24)@#(}?2Lm )rU8Mp sA"Եq\e{CrO COJ\^x)OW;$I5u*߫2>%9#oըb3hGjђ LohXӌ);`axVypՑqxw-Ut$Śqݡ{` kޱQvY?QAc%th`"rܿΥR.z.vg =[;qa`>_0)>l]xGynRm7e {9$@봶m٧>O-uvWsݱfb>`Ѳ.̧tpnu,W3 NjB'V 겞N1K83ƚ.Xi'G =۪c7b6ŷ-^och,wɛNޡஂF4'>^Y ycv}k֧~?_:/ٕߩ3vsk v'zG*QeT g%'֯I+S _iSU; 85xYR.+'r{Xy(rNN vb+?55 .NSzP=LV頮Ti祡heytCLtZaX~`e*]>V|)JDbBg\*geAL _hyy\,QnZgxT]4E-U%򫕂K~l%?-`1Jt4&.w6\DY` <ބpi'iE'Z˴} l.9H|S:~vm:R x$VeSn02$Hp:77gp(Q3G 스ZDP #©Ήٷupp6N_h٭${ީB$_WNw0tC.!!e6 !׾0tyH失0 _I~$fgr̐F[ӐzZtt >S=mUު 44 , G Fмp_y/Iڠ ]Cl;I.N42d#g|ai{(oJ%GbڢG=E B^>{cvϽq}Lvg9-pUpDoCȮb3֩k>vtXjzQR!Q|3oP̈;igJ :]Cl*eP~Zp;p< ǩ\~C:}.6NJӽ^Ax!㫙&?9]a{Eݮ(gS {3{h=2YDp_[QGboNΠ}@x+wi F=aw>tU4]޸:''PT0XЅ8*h$R'dp@˃ }6<%Pl3B!9TWi;㝬0)ߪF$32@W}G=mw={^`oͼX[ f.;ȹ>X{=[_+V]IrIƒR#[b` ̏E̷]"U˃DY6s8I'p.V=ƿtsA% &dpIݲ XzЖxUnpIԄ4i5_C'┴P6(/;T! n=0Tln@{sCaHbnhbuxn4P1m Rx4Z"yH9FS~Æ6H~h0D)6hi |WJ83 n<е-37)~-QNBMZճtfėf<[bb23 l8˦Į0<;XRj h!QoDtm:8(cECd*RH5mQOaZ$#sm `?ȝԺG ׏Har;8SgΕ;zRy*SRDem9M˾rǜƏXF; {vNԆJit .}'{HQvdl)RqG6~z -˦t˄vc;.AK ?3Dݦ$y_*iVJj<ǴG[5"S$ YR1bU`V%Iɶ, lB4L~6vm"tDc{٣q!`9 s<*XJ\r Iy؂ZsΊ9}b!tI?*Db,.' Lt1sR#,i6{Og{h*4J0c x(QSi4jA޺w,ffv6 N_&NYn.ʻ[/&ց=?~}d%/%(̕)x_X;gno%'մC Ob;Ӄha,R_33!q&M)fW <7ivX H.#=XϕoAEɵ"EVn |BfC#L◆,FUqm8r[xHnKZ?4s "η]"Ueݣ7mJgO!yKG"', ̒vQFajƻȅY󆗲M0 |%{dotVI:颈j= :Wq~Puah|JF;= P{.e-/<RIIcRrqx^xvxl7_?2͚gZ¿/5[lRJl}ӧR%Fߞ ~']>=_1ZOF)_yNH*8.HmtU R Bt<8*Qq"l]וP*:6`߮Gmwa, 5]?ye0\gv؆ƾ,c1g9ӂ9y'9X$zNJo`-:vg~ LE/R Nz'xrqV T nwe* pPz+JмW*vlrI &7 P}e}u >x-Ey#мO)D:^~ߛKlM_?tymNackYy34|?g\4Zχ;Fe)0õE~5@Bhj'ޑÂAg{'8Ԯ͸+l5GhkoC-Ҋ`"Wbil3˨II>PWWqZ/jC9i. ͐C$FX9dHp% ϐ`I rřs3BuyˤsEHz߆̮c/Q=iNw9 #Rp ,Zwob1PxL'*r߀qL4:xk  x뇦zؖ^9vR(۲NO$cܱҖ&I#Dk@~. 
jm m!2 Ր 7@t~xӎ{om&'Z8ؑymlkQ\Ӵz'IJ+}vh6h8/X ^?<ŔCNIRG6[aO@^~7ԣ?sڱ Ai,^߆׷Hľ qkO n _ї3JqnGZfk$'^_w,OA)aB4=;S.U6ij[vq>!,88a"7S[EK#qù}7tw+LO{U9)-C '&b&Љ(.b[h4NNGj98c2{~ u׭ m=+}S?)@%76o{La̦Ґ*yb`ʥ`ID60i,%7͠ Zu,~Q}b Z㣋\?ZR;iwTwqG{"Jk,q'V1ЩJNTii,DuUѸsfxZ`j }rħ cfZ4%3]QTWl e*-,;z~L6md:2#Edb@ՀCW^63r(f&A}S~vŖ a!眴 t3ݪeߕp΁c ҆3o9`@DABcaiGB陱PSrmhNBiZk⽃dq|^Z;%B&#AtV2Bc-~3.L*'{s&6ڛi.*ǥ1s"bf\7+GLB2Zt.n&i,D#aI}?+m5O = % Q`ɑ398\fj  BYx-&4RmH  O3t&ża&p˽_xU|")-'1ևF)Ѥɖ(:g*̐+#H8PܶfؕPa]~rc7Wae˒iN)JJr#rOsnW?lztwSGZ|GO7>Ap=e}k{X qx0Jp#.7ڡnKVR &ēe_Nۇ)iiՅe.q{ݸB:Wb>]Md,pq dA3 |`62;>kLqϻcpup,AJ3U &Vkҏ ~h-ݐ^HZ ֵ A{Z)n-Yٕ6[XHLbD{B>oKբX9k3s*a3ыlTʼ3  , ˾@X-J=/4p[F-Xw lK''顚v56Ht:''?-(|9/ie7ϴL}!T>´ɝ']vJ̑~EJդ@n7h[_ Bd%o,s)9:̻X?ZVqdzz҆F%̄#7TwsgiUEߔWI^_=!~WȡCYqLhī=L@ 3$UhFAI*-4ևiKx֓PjAAp9Z+c]k{:ldz}ӥtZGhٕyzk EϠ%&&N o] dRDG g/68?Fk@%>ٞsuOl=K 7wPXlkOI(5Y辡{G-#ys-庼p\+FEpKf:w#Co@dI:܂sچfIY[V&BzD5eܤCraXeʟJfإyySܣ}[F(0 3a&Yqw3_NBOI fp+uCԤ9|0s['p.r(re־-@__E}R.r22,xV&Y0|$l,Ο-H4|Laʌ~ؑјz;!7msHڸ!7n CNW|^^_ZO7+`QE؇ !@/O GӲ,ɒF++EKNl cutW:4ț]w" #IbIKe"[֊ klPG[JY\n,rIq #5~¸w+~c}ai) b?vDgxH6u7؂M (,!<#L~{~'˳-383~C h;t6]޷:4EF:ݔ~>ZL}|MjK< VRM_@/Q9?B>QipF2!|YQu1BEok"Mx7;Ii+QozðXI_U>Aߧu6)"HRwl-=!L,ऑ?"Q53:&\a|Ae(F0u/ Nچ<=Ȟ8?!jGCݓuEE%"%msǃUO+F zi77Ysҩ{SM,YE6YQ]ܟݼ=zG(fRؙJIxc$ d[9O:q AJ^֣v-ޅ =h< fb#Uvoa ,/vamʿ[&b;Nl/:ɚ\:"m"?GL 3:>4:]Xtd^BWZjLֺʍ(;fbO=8[n*h!R.w_|v@8o[a};LoM8gէG\AMzMB2^㔐<%$o:cl/ȡcs)cЀz5{L5SHh4^ &1/~6>} xs w</-"ގ(kma%^%qBLS# >fLǃD^(׌Q⸘aQPRźo* 唓y{Wu4l󢸻{wqD6cZ;vxK݇1==K-^2)Ib FU?iyJ75L-n&{lJ;X}o-X7N\2G*aCzAB )9]KdǮgPU$Gz{[FnT4DaP؝lxNŹׅfPs&a,~({uF,GO䲸vpsvOtB{h42H _r_TRAJG!,^uv*g";6 YX% 9W 5].12b}G|P< P}$C`#(]C*+b~˩*Йnbj[Eƙ wkihZCY8@MZ dcJapi>f.$6dbShkGcRmqO|C6 :'|p4Ԣ(5-=s0k};jJRֽPhp~.ޭX@k<+m+AU̘ RB`d" EH*lInu#Mΐ_P19mxhZ;W~M1!kyQJ*{-J4z~&Y:?TڥF?vMyngd)_=4KBNƒ$l\-?|^cy<4:h}Sދ>WWmVW>& \A#=v2]淆V&]-Gr|wRasai uy.z ag\`jʖgzU~ Xo./r jE:D_Ũ$~GIzC&ojJi<)t鳲^o!a khmYpΎmn\7~vsٲ}+88H@:qF;=)I+iM`ɀ~q(֌ۮ$:~q}zPJWp~y"~yi喒m&ZʒB6J't:OrYcs|M&SyY*YsIRڀօ[Y]5;kA?P&CWz C>0d<FD}+__:pha*ޗBa3wt3Khi{-Csc,F b{UG 1sb/>+͛B1e*3Z>\ 

dȭ\bCVw!+#wrɒ_ܴU8$vB B@&eqJ\EJSftoyOm' >' Z2~q"GϭU?iOy]d,\O?ogpނi[+Z~Uzw^>6&! >U9l]HO_/.d,kl'6Eظk;['{v5LJMɲH9J 84l.qS&rÃ|n% $?rDh98/SR/>7>4 **jX+t2b:k P !3V_B :BLK)q%r6pX:㊧e/*B_ WյPD&Ora1rӀ[D&MpVг :Gٖ e*¦y` %A"$!:[WA~U.r45w '~h| 47hn251mr úRt ]"1B'YqHbA\0Z҃3G!Mqۇ9 uC-y{ԡ[EDX/qG1\p-v,xYfYw$X¼!aR&9b .LQ"F-&kuBx'DJv͂wN&zH!Zƍlزq̐ٛ''@8"{ǓZ2- ggF>3"I=rqESؠln"@{g6/[C:thkD$v #[C8 qC$#vz60o;:h`90DA6\Cߡ^T`^<鷆8c#¹cpFXQw'Ò~(,! W͆zGe-Vl IH!ߺ ,a)^+GazvZ(a)D_pvl+5dy&,ń9>HHn,AscFO) [ Ī$PcXx?%d`Mh&(~(5[ s@|p_CA{٠UؠՌCP S DdG|</Z}>{ixHAҜgU_;y̿^>{.j6ף^rgm7Gn*NӳE˰\!L']i x9;rV._羱5<۾1=%KfdJt>ܼs}1ϟ?.߾o'ojI٦q8pb?[HpocǴFr:"LpR_ŗ&wX&w.& "m7Y%]W/.]\?:z0o{uqtڳ'Wnf3'jl{;h7Gp$qOyN[uV{9ϧ~b扵jX1#L+ -Su7};AT} v m$y9C&n$ڣ$Ch6 .DG[ArmkYYYjf+Cx,PJ$ VM<.LKՉz㸳h/?qq{◮gJ#ϔF){I]e!2/Ђ?MUۡz~ήXXێ ]#RpIEI;<l-qbs|oMl0xF*=G':i |^I /{%6fOCx_vaȞ3MǣƖfI׀.E꺬8[(deC.`q2ot/&PTz􋰁\Ry4Qd\~w6=ܛDh ܪ2ӽ OruF%"zO2zBi̅`5h5>෠VB(Ֆ^^,T8~xC=L& $P $7 r)B&[PLdMD0L 6Fb=xL]b8u1"nиjH9.g0 nr;}韰QX#/O魛V8D@!>Bx"Xwk됌42PidO/C^ k P:{lv){QG r !aRJ1Bl:`fȑ%*J#_4Ǒ]>f)&zzH,ZVkJ! 
MY@#uʎ9V7hzHZDQ"5^4xY 9#C-94^u<֮=#V,a+&@ DFJ '厍1p& KGV?ɬh1RopAB-1T#=R{j1v<4nvM^rY(.:GpImqd#Ze-%G/bn.@D/vɣzG0v".NTݺ$7ŃT)pڸDӑKFz6ct=:: q1Ah/yޱA1 ڬb GavHb\X`S5zz΁DMgD@gƛQEp&.Uؠ -]KQu:\3Э#G'qqp#)#62snhjka*A'pK ?\Tީwt( c#V-IeMF[*DF$z8K\-q<,2spiIZ%lͶw_K06 ҍG8 `Tį%ج۽Yo2kD -8I PϗѬRҤ_o?;eV}7Q-˘\DsqXD/YSK.lmh;ᐹ!،zh$Rk V BbCYPI9Jk)8Q+M/X\t#w:}8–Z wthTRCO>(!sjJ$V*Ebݦ2c9VuULuM^R44 ||&ӋeL KajTJTm+JK.2cwaǺ#d]0jcEK0(Fx+?0ɓyX8 slX1EtVcpOF) \܍ hI8G6qrA-S>5-ef6[ރPhs@IFoC{q(~<&AIcji/.Ǥ$V ѼpvU=c3$Ոdk戆F6 [rUk-bKf+ga fHȅsجĺJhU[-' &T[9EcR{-}A2e%WD7Py\V}l7) " e69|ma@jeS wJh3#ACy^s4L78ZP+b')bFJ+c,o~8,DpʃMVGP,Qn1IC`s\rB%\&r = pnXBfKR`98WN%Wo[妗;=~][oF+_v`)b A,$/ ٴ#K${& O$KDJMM&LWUuըcQ ( /5 B E/5JYbo#%R <:h)dl(e:VE8uP%V3(7 qF!!8RkP@a (;Re)#r^c>P0 D`C2b {LfH̟)tym#d Cf3D2Ha E%1zdBH =KeRi\/l\=ng3b\K $f%6+ ȳ$A 1Ero(!sQ\yfrAe\@Qx$yLV]b[/*.Gb 9 RD,E8S,1V[7t36P[JGC>XQK7c:\E0~4$idfQО@#I(QM8Ȭ("V{p|H3C-v yjʊnuE 14bm{X⼃pyECv"/swx,# q S-Y"[cZm9&>Bb/HIgо1[_&:/+!0S sz0U[ټ?tFu ۯ¨Mx/o|Afgl/kܶ_j^lPFQ^k~)E$m Y'KēB/gC C~<5hTCt@S B   @m6[vy="`BVՊI[eFPpu߆&g=|<&1 /%97\ev.fqT\ʆ=}ߑ橝pᲵejd0Kc̃n8С94|s9qA AR nRS;RS A OM:^BAaՃ, BR |{2^b1_>8j$<9aSMD䡺BEȥ'\&?wNj?$~̗rfC; y0=?.sus qe5]G'Oz!K͐-!Jw~8jq[6^P-S1^:`R'}nG/&`}1!}+;ٳZdΚqkZΕ^M_Ʊ'.ϥ1f ەA#}ƊrB7vxjqM$Pvݼ&,ݣ[V0ko>.K󷒞d3+A/R\]V$!R2Eh7NSVne1p(.x MpnUHHEL7hdJP&DڳGE+un0@Z-[u!G<Km8E.űRODhuFOz f 毪 u2O:C 8Ihfz6J~fszcc%z mԳq8#Fh玴N(Zg{uMiӃKL}1ئ{"U*m8ň;F^|^RW\@$|/A"B|RHPcj^2c넄4, :Jͷq" PAoNjRg 35!!5FpL'=ELǔT`n~ V#2rd+3E`3\[~w1Cr6A''NvzoG׎q y ɲV/N4>2xw&-ϯ|^ݐ'WNGPBa1 >8u4#XS-}w5!5IG]"@|y=h Ru'J0?y6 AUoB:+;j6f Uҿˀ"2 Z{X-VTB*+Dpzs:ǧg1#ԃj" x=+',@15"\LMF1@8!QD@"E2~.k~UyddJHq Pn8fx ~Kǂ^sǏg=``R@@ݙPE[2u8Ļ6F]J؄u! 4#LHC @jD=[NPS>ߥFẘ)sыڷwv- ЕkM >K(;CѶZ,Hu9ݩ8]-n3Yׯqt~wW7WRC)$N@ bH5Wv7oJ?_nm% Pœ%HX44P( š <= zwws^ٙ%<3mk"ĎboU|`[t=n y1/EVbk[4xT9{sB.) 
1CH9xr]LW-SƊG`qbkZe:{L'?\]rU(Q'~=ҭbg2뺟[Ŝ|B{b\) C{ px|>:*{ݞ>Cր~Ԥt.dž*pQܙds@rxz}DRV\a)Qc V\wѳgScwx~&q$ $U,]U*77ۭ}qR{t l.?BTz9E@֘;BZa׳6鰏?&S.HLM pejI?}I %gI-A)FXfXj=d[_m7c) ĞJl`TѫHJr7i2~,0Q @4 d*2_(V"ag9)ʙn%Z°9țӀ15Dq ΂H1JRGdm,83FUeGXk [-ȥu[h%|7>܉+.ՖD\g'{iQjmHwj(αg-tD>maS!5Z-х/\p\T;EIB0ÆA ޥ0{$9uέL$d?܎h- %wz9#tD4vམB7}،bV3!q5[SapjU[~T {W[n{ ޴"wqS 6=;)ڼV?({?V acuL!| žKŤYU]!Ny# S;t:BAU 8x:v`pf27kmN-vb֌~7?'Mѽ+$*iJ|Z׌ dq\y>1V=$AEks>{t;"%nHd (Ϯ>"ܑR>Ķ l\hcg`A&%Hxi0|ft4/,- E %c},ΨX(`dƭly^`NBBKȈ@-Q9xY$;<"SGW+8Քs! 2.<{|=9s0n$&Y meS81RH dK C-gÅz\eP߀RY#0GJo !:̷o{;N|OO|_/fiit$\,"Q9`GZ$ZR8 XC=ɾǥ f3Ýd4Gz=(GòbxO3rţN=h8޵m,"i!IHE$wb+K )9N|;K2%K'73lCWh}8-Wm=^/,Uv? u\SW߻1K8H. OCX*8KAKT)GޜQ! )bMCwyeI;Wp8$7&}*P;t,,ZqnjNM}3^?6עQWo6^gO1,\U4n<`e{:3kZ5ƃQT+qxuZASqΔDN`;kATMLs4Z/`G0HuhHvv5HꗍmQF-9T*;Gu'V3(E,Ǫs.)-!\0M,-WRj{kㅎoMVC0m LN)p.lf*EЪA{4sR *UBOs=Rl0Cd(` 0B@< b$=Lqu6oU[ʶc+mLlmEnT 7M4:~MUkotI\K`쳊4A("lucbbʗ|G$UxynM0-T M 3W ^ ?&xL/Xbe|D-̒eYr!gY0w(p8CF:@J;B$51V0(:Se*̩kxאJX.=(^ _jo䖒>WHU66Lɍe&rM.!ʧ}ޭ(JӇ7-*0[ux-Vvao+)cr~l6Ի4%օww(ZioLv*V+s?J " &bxT5lGN8$]{5 a5&w?oǢr[6WÊ1!juw5w ; :ZmW Nr۲|ӷ1'd<ۤ2b86'1kTb’WC `RkOVs`/b͐&~ a#YiE`c'm}sūخ|4 oBNGBu_'L^j0>yti+=i R!7InDYǘde^ >WAеk+O=,ƹaZ%Qo1-I s ZYb~\zȔOdb3]Ԉo'0H(ҚnIk'WKaͫk)/ ȥbVL)w7/' *`H(A "S-u.u6/!U[+!YmڗJ I\q6Tsvù8bt7Ajד+!>kX9ȺiWf; 5f?T9+1!]nVLcO4L]K|.ߕćhJ~}&Z_~Sǵ}EyLV3@>~qIWcɯBRsRdfVKkR3OB|̧izSV} 2w#5 2^DTV7aߕӧ7^ǯE] I[MxzQ)^O~=r"\M2D2T{?̩%@u (2;'.WulN<4syb b8OS`lovGEgG&| MPc!4ÓKa``'t8]Ɠ46ݭ@J. ɕ a"z5k:/=SHA2Ο2nV68?^)Q?K/A1*>t R8W?2#,aa836\o܏o'ИT͋WådSN-I?: `܍jW$`5_,7i zfa7$Ào/p-Lor=~w>-Olw`\/=!d ? i=w["*STnzFb8z{'܇ba^vI8yFfRF\7o~&͏"olBнp' H-2sgyjf)YѥϺw$W *_{ sof?ɕgĭX7(ޖ: F ͛ASwNq~=G`a.Obwyn-}:rZP4xy3{kafɵ T.ȮMqt-==^Mu颉+ЕcFv z? nl~7IO+bӫd_h 5Z a??u_@ͬO(BnpF^v~W>}f@?~OYA{iM`rfɊg e0BHpg9Op~5?J.ȳ~q6 \lOGys@1~1+ (P:%ϻou?>~M?/w7|6=Kwc`g-̉/\:j]2"|5[?8iQY䟻JJ|rn.WUd)+1˜8_LQ1 m-c! "K!6R|L^& 6`?S"HEX"[O1:٧A쉃6ǝd45?r~m38{ ـ)$,Nm{9;EtUED l𓛙MzO\Mu(VkLRCj~/OԲ[B8(D#8M}u3} `xo gb sR^0B+.!Bbش8MVȄ!;x]Dg^ gpSXo?5I|5k0Kt:j:"|nM@QwꀪɺϚj׬1vrۃ|B5,ȳ^wZ=Q̡x ÏQ&?f~Y1໑ Lujj? 
{9O.AqrŃwJLbrǘ4\u6 |u%߅l->ز.:򝿻)"Ϛ]@FEKj{6Я[o}/'K3zJ3ZsFSW*~3u*F%fXzxT[9MWLk1*1IA2]y$wC-.l mt J8J-$$u!FNri u!qaGSyUlcĝbrL@bu!J7@̊Sb.3s_hGE&Van$((,NjT8TZLIh13d#2넉uU݁jG^r*in/í?ER WbeCB0g,Ӏb"L5L7ǜġq30j69Y*w>|nw&qC}>Oʼn8z`e-c ICAbc@-2JTA!3k1+%Zk#ZeuHi@nsXQP" 'MaA+ANkXCAU>^/P%͹ ֐E0-+# 1XPUy`a``($DRSqd@'& ap S17sqX#Bj'M ޒ7ݽ ٠v fugL%ŗcg0{1/2_(++XTX3U HdDF|,uZ(׃) C[,[VBK?>C!#/@Hi'Y(f[d?RX ƒXqa" U&b TNj`Lof]w]MaGͅaKp2q0t:b< kzQ51:A ڙk%sm0edD7fpӯA"Q\E/Ht(oC/Qgzo&6$010pvgYt:$>}}{Kfb֥ث/,in8ag")E^|z`QqOEF fv9t9#VKCa9w EB~ ScO`-N,Zr8#s|bGQz{)>"9; 47n9Q>L v{0o.gX? `Q!ģ*fH4d *F>I#J?{ǍA_Eh; 6H6i8pZ3={XHlKU^*Z֖^kN~`{[ kpLׇȥcY*I^amtVJ2#\&t MuaGj_:h XkړL/䴴F[-ꇨm,*[~UlUM˯R/Ԍe?DZQ'OZG2JNpx]9 ýߒPyx/)ښ[Q]|~Wy~WV^`YrMyo@>D]MSyFv3*͓<@ (f'+.FkF_zͽ8aę:߰)PꃜdvHqvAΚ/+- 9}k oV>1[F5D&H2IFR % _t~u٣Zf+ZyUnbG^}3 n·6??ռA`/xKC?>]˛AZ-׊%gbM\Gf2|v/ 3gG 7_u}gu1 v{'9<}+luH({fN~s:zߧŪQ| L %ruJ;&K(Ղ^.IH\9z 2-U>dlǕUr,T`8b!ڎ#L2O#VZ[1iV8-2.v QF\tFdrzC0n렂^G?0Lqv6OOS/o“ӄi Nj-sKj-a_Rku0u$7xAzI{)FLW !^_$Snj/W|Po}>x{qĐkt1PuDlp*zC)38z<`1ll_VOL3k.c''c4w?݀m$K {u^W@MQY#:|ι:ʘMx Ρ ;|I]^,W *HBW.qbO1&*8F n4huNwxin>ܺ5:koڭCo Suիe۫}\Sj 63NZKOX@ɵ>\~+fTݨN'P ~6+x;rscI;72br0oGLVOo"&Ei1)y4bPre50a#^o2ޤ р92:uNJgv 'Oh%ut@-'gFǪ'jy@m@J[2p&CAw)krobvqjt ,Kcf')ɱсALGrk{ZkPԖyHCuEuv%sO8CgAm*o*Epp=H:hjgf**B 5h&=^ ]d8&v61Y( bF}Y}T}+@5:M|YGUt/Yg78]9gȑK#VKk =$r[S(cAbtߟjLOf=3klB^`el32@&S)9FPE/ ŒΘCiuG2Z^̪%X6@+ϕfj:V sAsdylu&7bnb| 4N7Ջ2wMzL%6$, 6I`CYbrD>stJmr I2%fܽIWT ^It3ȷ#)^v&'&PGk &S|KT$\ª+>QBJA60{Q']Qy09Z<_!|eZx%KWh-|izw%{8*w:Lʽnd4p8mQ?Ȩd}oUmc}2 b0LNS9xx>9EθRNSw^vRGi`/?% |՞wljqJh=[rt8X\zz]/P(zg5jq)* OY:cpo:1[+5h<)SsEm`)H5(Ρ6clI5d#egHN "ף齺g2%;ՁR.!JB\: Dߑ*YrV\i1.Ov; ~57Ckր\r4jπ;g Ւ 9䧛R.T" "2ɽz9,qP˴FS֊~*bLSSqc*3_Ƙq,ͦ*87ݛbs%m+:dXSf ITZVa53tyqVwjl,f1 ,C]-~jm3'g4d 2% \pIf|ܜ%Yz+Kt4-ɮFt8alwcHOuPҹoT#=ՁrD6+L)Z'{E5?⸲ٙ%!Vm&s!c]de1`BDr# .9ZG $י6d( [hF4R|~"u hz2u탧 IcܑPp UkvOo&b5)Dp^~veh|$iedR$:\krB*:":t{C~|B=fj6UVBdB;i: ⬈Bۃo揟;Pg;NPVΊٶuvbL2>|P-vn4zPEvŋQLF/nEpp-V  GhDftǛX/b=c/܇ ^Gn\;yvh~:GG|Mh x(!lu!d"/R (/qRj|MIּHͫcriAd{ZLk2Jסnoz@1{Lݢ 
=hkH6愡tJפtuejYhܣAm*d'lm9ˣ?(E'#N+"ͣ"ϔ?d{' /L'& 쑤q|E:p'JjzQ̱$|MڛWLfGtUeiycJ0./-#I*%FE>qc˿Hvc7˞bHtFW>hD`^: !Ѡ:uz͌3*2 :ۗ z_}Cdwl~6m$鿂җyyQ%USq.3(-IűSR&%\d3=Ot4z$ߟ#dQFvqdrT#{_V4>T{P] ~޵ix $9)0|r?I?X` % j #msHM!G`|AZ,#@& :ZfJE3 JEKXχKen2-[42P63HxOBS_Άj/rdW&<}J{dӞxV)DΧ"\0Bh\ `Ь4v}Lf)DO@s׬AD3vYWHjeEn ns i0*cE!,*7eH#+]w#Ѩ zʦkD]{o]a:Cbc4{I2iݟT/(\C23<Ƨm_s#ѝu76S-i p!]SBe{hasd k< B?ל)EjF7Sr$tոQ[ӿ38zj ;azrf8.,MX?{Nd2 -siOczs7uSA. sJxr/ Dq"p#k*}(N m +{*PfgMl(FTfɑO'~LאD|?|m ػOMA򳟆7gCYΗ%K~ RlGz;1CHpUۿR 0P /ξ_XQT#+:dE(1\~nx?Xaa+'(xJ T%)yAnTr9h#qNE-I젤 =f"u"n^hw4-$%.~g`Na&xgFj # yRU1c**7<,foH,AíۀZ#tTny)u5wEmNPb8Fdհ[`Z4WRGn`}8JIN.ీ)4\hy 0Qo  諀P#єHB&DqPKI<4*iMh=b4fT^{f,kv8@T*Q<{KQtãȬ<^qc5k 'P+ o EGhX DA?TYD]p)ҹ W#ڈ&K[ŘGLmkeS`z=U(B.:˯}63 kTwȾNt"^5 jv\[^r_ƺ[B#O\Y?WIy_& z8QF$sdjEa$/}Gf=g Ae9P!5ȣ|OP@?+*E p| yeMAd"~r3q[GɦBd-N`HCQp#N%Bn 朿LSQ$U5X-m_{?jݚ5F]; J˽v0}JPPw! Zmgp͉ I~x2oy%R ݗ /; j]scBQ޾~Z^OpY/Jк/GhxQQ0O?oZ.s3b^k̷u:Gi+n+|Ǡz};uU_֨+9z 9:ݩ}/GabpA0 yoe4%L?y;xӁ:@Td>;WB5O/"/q0ᘨ$;ұAz8~zf*G 8ʥklqm03>-뭭Zy\[3Y*{&}ǷJKb#HJiYEL" P?!!oUָǛ8(t z2PD#Li|4Qoϐ+U"؄j d*5s;.3)F)__]0F& 4oDN΃ Ca/ ?jCpdpOt`p$W`kqz8608'AN+h4Y<׎D>ݙ6St 2I+3&5iGJw`:3"h=%aNM}?QH|G*ʳH?e?䫦!,zHχhG|h/ jP[2P$ F͕ LPK"(,ZZЙ7y[?RvTR)zJ#iR=*C5rBu/AQP OYƑ"%(rҹ|{-&lv{6 eޛbg'Zmy!"zɥ)]A3vL%!MchZZ>>!GhsZ6p3R4( LQ(t2iln8G;aSM"yHQi&h-+w n/F:qiϬEIuƕޠ)dVj)Fg*&WTr PY8&sKPy߽ۻz] 5OK|8n\> Ȕ\ -j[s4W :ۑpȼKn'wʟb]<$.Qۋ7ɛ0{MXy7ߘLǽrxn1#4_jw9|wk:Z  i-KV¿ +jйYŮpA5cts c{iX#x& 4gyQxp tu _R[8bhv@KV]Z!FC| <s'@\yo/tEdm <9"/ XrN,9g HT 0L4<󚘂4bFt%Ql<ԁDiNli@@TݩJ>p @%K):QHd95(48i{G͞v5C%2C2Q=!,P[ +gԐxӁ+v)_낖膢\J4 Fԑ1$8lzIfD#e̓⢍^84ҫE؄azb ߿<[S>3M cwYOf]Ct6|(I?r L>O3w.lrS'w;i4OO,RJuNHDNe䫭e ZSA|^VAu6K'>#;}:kq]B;Z~wucLjf] J2 g%lQ8\@evf߅Í7s1.2?&v:^oЌ Kh7_ǡ[U ŷ/U9 ?_K}%e4ݨ9$1&Lvצ<%C! 
i>ͦ ŝ%ڠiSJ$ )9`^ e=%L,͔q)́34}uķ /fg8`or q|x9A>^7_I|P$1[8m&ꭒ`yw5؆O91 8i[|<ᢇ]ΟDqdkxԘzGdwGRwulFqBq'^ͨyn/@B^[ i vWۗ,Ru誝k|ԁs+sT 4;[BYoETUZsv3* xaZW4gf;ܒt$~')JiG7}Ne'eղ(N'\r%"Y!RpU T(c<!D9 5ǻW^vIR|+ti l') {WA]ٻbt(oS˅VB8"WTRjb(7E A.:Rx*I_z>_?/Ṳ̋j:e򡼥"B$l(s&O2S2t'[⌴X!^6Dnjc 4-TڸSzmGh֯uԡW5(]N{~ӗ=Ha@ZumG#  6\v^D&ZF-u:b i`TrÅ2¸9 Vŕ}O:Ope֧w.}wՠXwXIjŕswY:?6 6<\b|IZ;B\HtT(¢tސ?ZѬޖs)Y've0mw۸]m3QIDfقLfE XYU[ hڸ=lPJ^?k 9&hW;6{WZټPpaJlN3-s3qEAU[ G̳g)h0gz#_83H쓽 s$/dՉV%v"4awO++.~U*nNz-nNY5W|:*_;7}:d5jԫ%Ҿ9W/I&#15jMZ-7M|eM|E|eSrf`@|kv6V́ 6f[28\J d&)$ #J \g8KY:,ΚfbWVG7dkZ C+䮒Um0Dm2̦udvb`释K?~\+Q[8E\))˛ (OVfa,Y2 YXx 4q#w( ֊ע-P+|` #d@oDMăeVC!D4$ބ41l@YQޘ2l<@o+'=IYY " ^/[ޥK%ͱKe#/M$ΥAڈ(E0sopvl$aȚpojrOSKq4s: ,jpu:cp+O{PBJ0j0yji[b@jBDWXc^Bs"0&mv*҉h4N3PKtI@v Jk) AE؋gV)fh'X B\ږӰ]/Jr8?ZB[b_>B0ceOM *'筳c2iWݧq-$^ͥup~]S|-ҟzsЫzjFnx7AQH|[po`9к&M*fUي#(rLEFR{|m͗&YR|c!rE_jqew}zp׶@3dեv5Y @onYGx7 8Z~:䧷V+SCF._nvzMBϟŪ6~zγOpÿf}dLϾ3;P9x?ѓs ;G{YyKLzZhLj/V?8V>i6?,K$ۛyPrv;GG(V N醴W,gmqP1%7-UqMx;GwO93ELW;?$cFCER"E:%<1rO.lgA7/m| FFE(ȹ)Z2~FxsSnhS?-Z-=IW.E2euv[)9S6vxC޷v+^hvBBr=ZOֳ 5ZPO~2ь=XXyeV p!y%w(Z.,D Q6E. {C[.cy2 2(*نGi,+X0QXÆK9NY<=83wUN"b8o9xzp 9jR|,\ݷ˻#+gwq}/H)@YYoo.g.O* LEy) ?H%#ue$H"g Y;bbԇTʆ_ixgNϪ5>dzOwߝ1>!xwyqoEA8滿~OB~Ij$oB7XѲ!j _9z;孌91 0ղORfryejێcԉE T걌OՊF*Q|{k`W:╌f7Dj\UVzeǓ6 Y}»B~}W[L|]/.} RS,qdpmx' X'&65tʲ7V%l|ng1jƓejfR*1MԚ;i|HyaWQ!oZ%Wqd$+nAfVvZLa*4tT0‰b0 [!g>h^SjU)!e-Uk$x:vVpe{ǟ&Uy4j-朅֡s1~_ P-`N+a ߆ Z(fte]2^Ct5AʰGn%[#  X{="m dSf$ 9z{+I t tU!YL/Ab7P# dLuj9 %΂D#@UpX`9Z{t>Dc‚Z[VQ7Q8IAviGYuiς`۠R B6qZ ,F;N:M0NpQ~ I:8ϐ4c]Uޞ@ A:W+qdଆNYYhXR@@~ |FHMz({ײ0M.:òfB):EBw#[[W'*Xvw"r {pK<.Ś`NX>4kwhGeֵ͗7Z+"k\Xfa`R%5'X;uׄ;5m%%&3QI Jf!ģk384h<Ae1ù8 Z2^ -7"I.~'WU|LJX? ~uڜ,6g9k6o Ov#ۀL I-V`%L{ J@6`:0E@$m_/g0۵쬳]-F1T8->~#qBǏxGx1iǟ~k#[{疽tOBȒg82`^mDloϒ=K,۳eZx! 
\`*Dr昣S%GlcLM%l]V"]V% |ƏgDrQL4﵃ ~nDv}3>8|f硰{H G""7^[`Fw=ڄKL)s .`֬uݩ-Ԓ_G Q AFe& $=a:4lOzĸ7 /߆gqo7Y(sntKrh_zi\P 3t~G V[O}`sFɞo)HsB {ū+Ym$2#?@97m,+gD%JMG j *Ye*႐.n +N${: C{/%-#R37M KBMfY #YY&=Bׁ&@֒U5QNUQ)3{"g7 Fgw(PQҏ-i xFt-":h*fh*f 5ԴETԿ΄!Fx[٨b[a Һl.P~|C>0SqJ ;qr„nPtEC:p:#3\9𵦍J!B QѻEIo >@tyNc+ټ>le@B_#+Hbg}4F {fH:awi]oe_I8r|8>1RJe zN(GJ@yUh#LU!u> Dp:ILuh]o-#=X Lbf"ܪ\!F#D-X a4Ok s pkܸ9up6P8.HОk갡 ){- Jj ٠ɦ)+Hũq#-9AII:Pu`"=sP(F2Ú*AΰFu`IR\-0m;\2& @Ppi܁˷~8KxOV>#"F99z~RzaZ4& >^߀dnnDDHV7'`O%χCjN)ΊBwf:qix|4#UE&(L?|ypsƚ"D ~+OSO7!eoqX$ϱl(Pt ƠBAzWgH\KŌ֫PռU T¯qz[_㔵^w#jp$#C365^+N#yyJܛ*A=0p{bQfPCHb:Dic7Lo.ehi3ygx{Y{1G%Riq> 65T!3JVwT%{/nRP!S,'J~h2r]~N]D)EMHWTorIw"tv'.ep}i0&i[ u&"? ~o}>l+y/`kGh `4e4g%A}\3~VS~A˸G{/hBJ%1,:Z;3h+U"c[+bpS[Q=p8:s\h@3˕úLG28^F#bv\ D`ڵ9(yQ,s:¬I G qho[ *F(#V 1 BB&fDYN@!$za8R -CQ- fzLLޒ2Xs8C"eGBRx.sgL)E}c-#p$|0V+"no xrwDSā1dLb1mH,( l0:N^ݙɫ}US|UW˾+)Qa y0&c~ig'0*wRFSsz3 drQgH\2#dhD3_krIE·H|XX_\d~"#|w/໇_h@oB*˷Ҋ3& _xv2y`G_7t|7ͮ Ӛw4pVUQm^tu{yqkfuh!zLG~#6Z/lBVs{NHޙYMFKFӒcjbhVPpuEv_on:ph FvxwK?M2øe)1ynwO|sB:x)q;)E!A> 6'Nt.|bm$QYk㘫Wow!^TySA6ݾc2>K\_5[:^ o]M]r1*ƐL`J& MWWih!9#hw7׏3}c6V'k1+KR͕RqDm :s@#n)[uc\<8X216 ~5yS}Kd/ݞbh@Es!DL|6 ,^PyYHƌ,{\ndk-dFJN/Ϲě4/V陝8 Gc]v?Z6p+U UG*oH%(Ac:6blRULtIHnvc~KkB "vi'ZLzoeXX֍TMEMN?٧GfF-|tjj}vmAg\ApM)2LGn YG8:iUu@tQ;4R1wXlD#(ºxF_yPжGO"#Z12GڡRPE:4-|QNN}cfL.k`m&T *g2%r9"jHF0nHNFk"T'IV.Eѥ U s:y~D`$tcX_Ёj2~U*^AQ$څf6Lg!̞Wq2n?\ò0ƙS99/TB ,5slDD;aEXiCa8_ɸ.R;z=H9;ObPY`9:zyp;zJŻ<_.K9e{4W6ei.Msm˲MS`K,r cR>rE3HT{.v4S;4?H͹LN@lsyȨcI ˛P9S"I@IOQV#Di랔j!&>s0BIZrE S VVA5PkDnTkOO4>2B8Ϩ&nɀO՗8UD!pb'`k)H3uHjT:,D$i)SMo @Z2sϑ|m@#G뜉/Ҟ@:ULw'x'P Nj/\Ls{5@MysCNf&wԏϹUS/aUTm`@=ZMBjNm%?|֨WI!\]| \.O5ά&jB"*)e;<#j&R{ge\F28Ĺ,8D>6(lyK9y*VזK4F% n)jBȍ'D>o?9zá *O={P/e r,pװ"y%%SݽU`݃49 K;=;K6n ӏLގUZQhû,L=~ n*]Z3qz/:qWףŧόbsx3O'ī]|'<!ϖbE Vgmzʳd,*^흃4{0fJ\>,yk/-Npw{h>O"Z'#KX$ng~b?v imt5&7 Jt+#>xkoj2^=*r&5YtL`i3p(%z^37ïtHɣK\0ˁnJSJ9m$MGj]s"u'4isAEɃڐ@ mQs@B̀A '8Yz΢$l@3Tu9P;T1_:BtGFpGZ]0l@#e cUqex5ń0ID?E˛ǴXbx42{1kqfY+m#I`1[Rއ?u{E7{f^ 4-SjQ=^HrbY̪bM7Q_DFFDFFx;a/tdYZ.Ğ 
HMXFvE:^M[Ȁ+u7-/0)_-=?/p6 {JNZn3Xgí-/h8[iGbZ9dY< }髳̓J鞍ּв'^-2ΨZnȜw\V=`f[Jj'+qњ GvkZZ~ |+5fI$a"Qd"uAnIX>OO/87S0 6{*g@_>)犪Nʮenn]=u&x0gQ;A"$! i3,y1ʮ@k)O-uZ) "9qGcߪU$V2gCVX.,er]擔[yv-}>h@|QP幑4([K8-4ZZqem򊈊ɘ8 ::l ϻŵ^A8;-9oQ9/ s"s8hsY'ܯyێF 5}Tq H[`o@֘no"g]Hd>N@V1EOb1*옮T+ A5c޿[Du1k:?)_?ww9CvΪOO`NkTd #=&6I*Yq_ѕ= AB?f˾ & ּϵW0݄r8oNEJ#Ɓ⣝#Q%j jc8h;a5rɯk`b';Maqrl;=9!#yWhQ}̓MI4Dz&j#@]Gѡza) X`P!/Rmvv`U߹wYRrAp'}J&Zq&y obmNN*b«hl*BPLF+9Jf0SsgGg8Uh2P$ i G=0Dݟ xSMӃT"Mf W9ghQ`ԇ{S۱J,1a7}JI-?S5WhOjqi5 ̩ETY-bxS5BMsuw2}]oH,q@㡪:ݜ=4BY,fԘT#/BPb?Ymq@o# SW=/+ͤop)2;t֔F^KzmizqU %tU4RWcK8/wbHIʹ{O+(d錝z]6rf|u/lh|Rzd9Njk%YaoS-R+\w㸼҂jXRo?nYyKXOLd!{rm|߷H荈!IEQYtLb$S]!g 5JΡ>SH͹ԧzA 0Exa2ۄL%B↪ !&w}?-NU:,[Q&n>sDBnۛZ[O7  fG(B BQRaMŨpzq>p"#Vv&ie=y ݛ+#'\9ϴSQz+!"GRP#tPF u4,^ś`b#cj`Z5-hSkB@}ͫcsMb:c09?689M"Z<r ~X5ZRIBFW;0jOPޭ4ujfwK1/iuQm++zZIՎ16zهy[غ_5̱usQ|[9Ǩ7 !ޛB7p%>v{p1<,>%^qsf%yr߳6 ;(xlq 9Y>nGYB-EuڈWK‡! 1<8t6,3dTwE{{7 3* 9 !Q,Gk[;_T =yZH"| ^4KL!̼_FM0`{ŀ.lS'WK$ ,kј6ʆc ͟Tu 4NS 4N)'`sw ._p5`k[!p'Hoo&>ߌNf"7aLJ8,f&q1:ln|Ǻt4.EThZ/]Tq\q$("ȀF^aXc\$IMFNeGDEQ::OKņq~_6L?XwX@uuu ԶUij1?M-槩b)S+A0υ3΀޷ P^#SAcabDEktecHs:~퇯 9_ &YH>ȳ>AdeK:w1D*5m'1O1"_-D@nc ׈.u8١U+|yZU+.v7sD:5kR)5kRԬ(U>Fc:$*g@@q`RFwDEwK0g΃ނTb%eZDl#a![wu8؂yHG̼yGf|mZ׳gM}i6Mզj}$TN Xb&t\JFa 1 ÒLjHVDEjCޯdvתW4^q+nZW*j(*7RŔ5Qb"7Ri^{-G+_E-z7iFcrx#VΥռG`#Htعcv:\6QFo~#): zn?=\~ׯ%#N %uw!2u$V$?/'() Q7pCDw5T[qem򊈊(+8 ::lY  acCkiN<ļ;y}kf8O=*^1*ɦ {`ح4Cym'N88&Q0Έ8bKEdg 4jANTd G*VL ""A8ļE:J9xʐ#,r9%aTo7ѕ5w/Zl"UV)2[#9@2dwx#oBRFo2z[3zs ,́-P.Xx--(3jg9Y 280 0y;H$O\+| 6eP *(Koє@IT=] N4X)=!L)R`C :bSM-gZfF*0F?OSv\ R.?,~aЮFOCY0|K.5+?G}/ER7_<엛p "*D[7Mo|w17~Y;R?c%ܚ_Vlõ <%lnOg3xVz]KH S_q]B#<\ژl1ZBQȁ^x`)H⃷Rvk1I%XTquS)`oMFo˟?Ts7n.CG٭zjr5/٫yS\-~NzYN;DK2-] G [S~[Q7D"g4HRm@ I$$ F֒㎇Ww[D$KCA_829|!="Y2u@fO87mKޛWO^J*8j5/VuR/as+9lc" URz .EkDrGp4<Ͳ+ZA1=գ}1V(DL` K aeckM*FcE98&&yV6zIxG;ܦ Y"HRVmK%!=ZpGDkHfv\ GeC# RJ4“[w$B91M*$v"BtYv(um#DueΠ0G6xf%qn5B(E2XG("rao;%0ЉT7t#8Y'S|ۍHȩnJNm`S`ucIdmJ0]sK1sVb FDXX00Xuख^ Svl4$Ј: F÷%7偖< Ԁât#ZT' x!NoA hPipgg|^9gBUksQI0R:+g𸗝dd;lF&:"A1)t[` #уd!C8"͛dRT,l.G&U8zW6GDwq 
j$l}xrZ6G?{ȍ R6d?>l ͼ uaJK#,/)RI.I,Y vUGK3J9v0åƉh _/];=^"|B .hF!d拦l[)]-T־ VJu A@SBhRAHp8i4?(8. -BrqH}PH+.=(KԃB"9H@yP $4%hvHpT ! ,`e Ӎhȸ}^W 6-$Q`0pl;/Y|wjBsWf[Sf!|Q%ȶa4/bWUuqXa 8iJh,מП҈HE%?քCID舙%%DHg<bN.x´J\IB9p4ɵNIx 9\ gxLb,phA@"Kr>KF(ߖkiƹ^.Fb@:`u܅Zq?Mtr`fÏ6saͦF5B 9>qX:c>` I0 /t^v"^AQɹ1جibٙLPJtt\UEd]P ]jrĊuWTKC)Hx]Qnmۚ[QTˈju@%t) Z6{ͩ6U^%0nUa _JߊlPJؑpЛ:R3ېGնoWNөm#䓨N9<*nvxSuQ\N iӮ{T-]ׅ)Z=yqz 5ven_>pti]pe<]Aob\,͇Wxu3f< =}T^͙Ot`Ɯ- Op:_o(Q>$ZQThl-4Lfe-^7&|tyPH7]%(ePҗ˩$| 0'g`qma*Y>g1ܒm$R ̎qSF$!TTFG'v BngVaկ!qAvS^|I+4dS%RbL̜|YPdXJ@lVZܾS6^//%%T _ٺ~<%99|yƱ&Io-TSq@1P=1Q&.̺q$x~}x7?f :ľ~o$vE=~x!E;gJ^^{*d.9T?̑o&,f_l9'+יWCս%1k1ر.o V HT4[0" }apL˥ 9jg9~<}֋{H0 FވYG4Q`ɞu<ι,U?W-88ϳ!1 ><֕xV.D )%}>; &< U]ap".E qVx(V J ࣰg|]/M/,&<} n܃z+0>y0:O- @>x9^BK $UiH&#an,X|y11HEG@ +{\l1Lq-s76֯+v27x pF7MLNt,;Wx *8Pv;DKEY/UPGZ8*ۯWAtɈ T~qKZ&Aֻ| cyRU“'Ͷgl.䟵P> B_.3EϵE"rlZIN$$2[귾sPqrV")Tˈ2$,y:/p =p8';]]LjQ%+eYF^Fs-ߗwc; j06:K4{w9"I9} Q bU}Mf ԈSB;By{4bAV p\6A,xә!J(X&('qz܂{Cwo-KR .K[;QRWHޠtىMDǼ!$Mn03xiQ6PLVl>aYVTLz8D-ٝ}j۫NwzZhs8*qNJ=a}7h._ߍf}8A"ج ~dYK;%RCDV-f6KVcS5t?MQdwnsUS؎7jZ`<ʐ9mDBD" Mg z}zľ([ F?:Ue]m3DUAa~:<׶&pݳJodW(!<!,4D(cMIg\>}][sF+,څ=vlo*ur*8ټl5Wr$ʗ=(AI@&UNlLOOO*w۪ uۜ׸oً+=$0 IG@!=. 5׊GP"KlkD9[]hoى.PΧ:#@5Kfy2=l icr+@?#jMQDԌsntfP#xdyh AZhv0:&| zfm5Qn^'7ڟ $EWkƣ]_t vDni-7R8ы`OgmKCV_O7oY Wsŏ>t9}"g0L]wIYxЕ<*ǡɉ\?Ui6qT;٨屳ۙ0t6/_?d4yn43k<>xk,&Tl7XX creT jrY@"\JAd\ـ)`j"qF/.FFy7'yAb {VLd~bf'1GMz1z7Q&EaټC,5y20ƍ%5/F ]ͫ٭ ޜO_ᷴb|ϹEz-J?Lx}pllqn{t/1KqEpcO?U~5f@"vc@^lKa% tbu@8ǹ8W +>8ˈB{0gVRFyLT+HpzSsU+mP ߽kkХ$&ݓE7\TdT#`Hk.O#:"+RNsl=4SpjUd)j)6TvwF C]<Aȵ)~*gaݕí:ě8܆"3&UBx6iA]݈}5JWxCd~!;ym: O7E-~1V[+aD=0 g7'ŗ8h'l<`|`ʻ*7jmյl= ŀٴ <Ȧ o60-.ߨۯv{Q0DKУnWBч.&k]mbl*-k4/ n%=>H)țud{]]NG0ohHrH G-~N18{v j=DD;Cpt2vL>ị='a{NWц]Lf6x??>ysSr{ V|rS):X& SQݴ8D0wn*!ufpsB6$k16 9#ޫ[հ+ԶA[gehr4jqm4ڑhTFb?%r^_x9tX9XȜ?r`9?9#"BXܫgYT=\s}so";`Q+xSFu9%UӾwrfz 5HZ*ʳO%`b ެg(x3.4/.?w3{7|?yp6b.;Sd:_Kxx8;:Wo}g#{}=y? 
=X %ZƟK{o+(s)=H l4ZMs1y^FQy!yKިD."ˠ7Ae{ⲄzVVw J1AAYUMpĆ0sO5sź/#\Vic-Uᚨdfկ飒r|2a:FǕP m$\-Ϛ#1) -sܓHBSw  L{25Fp9po+rWgIF;dLF;d2mK"N`TEKk郷"QjB"T&wV8RNӴFD۹09iﻋIeI{ob*;Cs6Ӏ!#4?5@{8PDTӫ|e/D80Fb:(%)DqpNDa.q\״FzM߹ lŧ@"u15KF*q & ,.5.A;E`\,*I0Ax H*bԤ@8.d[DtfQ=J%>zh\l4Y\kv=u/a@f JdW6Mhim.g2 K3r2odK}gR2Qwp㺇J\`]uT;C(!n|߫h󴋽s׹p&Y`>هg?٫ٵtmۍ^^=hժll{*OhTr~ؙ{AXe Iys0tq6Gp5he"|a\Ky8)I@T ݐp$Z-:-JT"hmЇCf:C38s\ y^=kqLC\%N}M\[/?o;/ޕ6tBy皸UdR$,35\BKA`I%ـi)!1N?Oٺ ETnFo֙qnXSո:V,+Bq\}Gq@t_s9ЇXY脕UC:VhV U=ZuTGR qnXHeGz!>XLpg ~,bBr*_-RRP|*@G5BhA.Yg^{Ȟ=?Op",41AߙP faP˾C!&i"1e_Jܒ+ÌfA&|AGV[9gZ IC= ~% c@^Ƈ(8v垧e1pUh E!.jJ&co:@A^@Zj-z4Lw% `.9S;f/Tw')Zô+!Pw;Q_߇~:B? !4ĝ|êx5)^}<{k_W<]V"J*x-]%FA^}ξ;[9k+Qa #UR pU^w0b*ª.U쯂T)hc%ռzne]Vi?,x ʓ/'ۭԙluc>uFNC7hʣzӊz.= \4=yVr1btJΌ2‚\I ev&Y]郟")nh IDPI Y`Pyjk dy G+z|]c3E3ny%=quߢO1l`(«4kVҖ|x٣I eŜYGoHPYHxSfK-Z*R/ʸK%pTk*[/<ζ (/wu"t܅$R|RݚҝbϨBsTHzarcu0xi %Y4aJ0 Ӯ3MHFl la^6[0eU`4 2Iހc8nn#ʴicYʎhS`G޽~{s]xֹq+gԆ>fC5{O-4eY{lV[Ñ̽]rER/m[V(&:r(vY\dnV A9 2vsDS]$*3PY sWp*sPķ66ًT$ ܻH`\ӑvO_aN"Q)YPʞ"A`HnϹȰPL-9@MUƭȦsZ^L[pzhE چ^L_Ĭ8$Wh,g i<4|qm7MPM*2SCe ՜3Gڱjn4;AIf fgH?p*Sd6] DkS'Q%+ 0V^0t[kDCإ:pBTر*AoL ֳ.ԑ:>mfJxR[^=D%ppn[oLd >iRzY^FgnpG}"s4N&fP^xbzS@*T#b+`LRA) ?]+U'<sC<s2..h"*؝_\^8hO䃵URmONN|3ȩ07?zmϽPې~ e 9ԙ)F:!`2R~+$Wb\3RRKHLFVu9"Y*v#Q,rd$%|Z\ NgdnHL <dd,!sSxhHfNM,|&@( ?(N ;I683!H3 nѤf#oYԜ{p4v*wߜqǾ 7XhY 'Td[?ӽ$dlUlDcX(WpJt{&B)ҼC~j4R3Q+_/ Mw-xj`8|$|56!WG)bP9g/ﯭ(QBg#SBRi" ̌B,ORWDTN95SnC}gyVpm~|#X&TAQM{VL<ڴWS&F Osj4$zm%O \?P.kJA&.ojd3QB6!Q^ʫڠ6<bmSkw-X6xO@BtVKzH}m9Z pd?8g՜SJ$Ussv֔q1li 4ONc։tA|?$9mBC5X fZPl7cq *~ : +BwɆ2 RQ~d8k nnkWm  UW~U*)  pמȣZ bN*Y,7%s-{nz9[Gq#mxIbU̓ jY)qߛhk&KDtꥧsa}^#!Pύ`H$ en3]HV^gWUQ׮>;-,͹_g3a[R Z/AM +l ~l~ BTgk55Ĭ-bcK_Kc+ BU1݅eYnwv !Duނ+&ddq%NVvsI韜Ua\Ld)I4A%xvnݔ\n*?bh߶ LoIs?jf e;@PZ%zm،HUz:殺% +քsQҴcQ4g$YWd-? 
{r0|0: |-EOGk+]J D!Bz#@)#pfqۮ`1L>uv5-)=Ld%2K|95#8S8ĕ\x;=pyQ`UR bKZHuS[7jAƪSæF$2S\#lEf{^y.&xק.*ާ3zwo0VRJqv} Zo|+}HlEɊgԖ Dk LE(ďח2M?V\<\S;9!0|, \Ԡ %!gtj zUT%`4Ľ*!ąkqO!T\ty߿~"B2QΩܛᲈ17d5"v`"P<o*覃<#KdmgQnyuËRUncF#s~gQikr&q)KOgsZJ>S,I hyZԅM*t REM: 8jjJehVeHd&dP78jKZѲ%JTq駖 \op#iĒs7=]{zNV˃c*v6pXF3NQUvn, V~e rI 7sʒ !Npj?C?Wm.#A7q]c%4N5J;C>rviH+|!vg7)TTdeJH$owcD$0ۻXX4vcB8mآ }N5z-:矠LH|HM@oh-eEޓ/ ;C1Fʶ^ESyiBIB=䫁HɟIptv#%b^7g?.{Չ^0:`? y}wś)Y'xz˿Oyouuz?.^N =rWoп~tF.;?\tnLW~}<'`\x#:$}xN78~<ߎ[k9 FN6Ƈvnٯ8,P e.S#@wM"qq)sG{x әt |ÛN/zK tb,UIWnǽ?~ꃨ~:hؕ_}8N3@%ǩ9?_>vn}NoƭNoL^{ooSGoȏiۿ="tnNcχ~0@w¤4`3tx&?8tk03yC77FbBL'6>i>?!]$|B$?wL2]bGE3;݀Z98X]Lo h :>pǍ<?݆kH냠w`aDQdDhF3a "zf|IlZ{Φ!A ZQfR;%^Do!Hsp988g%Nwm„+YG8q~doA?q~Qk#}I"p T>q^8MM<}@PdBnWIyRI E|E OPuEZ#" BH”R& "@XQWX6 uhx&5!< SP/snX9@ň:#3) iPpjEBFΖPG']h;hiLK '>J MۿM0y8 1[C`WY܃/Nb{LdMD1/a3V#3WT!?\;9C,3pvW Tpse"0|fPr(>J.mϷ%S,{ͅ;8${c|fqw:2mtGSI,W//?˗cZ:#e^nE3U*LnK~7:?B-հ~cfPj<{VM&JF`:Z6-3MhHO@z4!xFqliIo$Ls-12c%G3ڙvS=mߛ>lnsǐ xeXHh(qCji$A* UL4F@K P7L- ww[I!w ̥}r&x'|ZOZݻϝ^zkE[o:Q jb'j.^RLVgv3Ƴܲ_jR>$U󩌕z%זX)*sŬ,K%4bZTJ [M)F; xI~7HZ=5mMRhBlRHgoԦDa *WT .>uTਰR^nxI3Bq=d3h&dj݉0wZ_pۙx\r:D ͛~uVF#xUxս@| "q*t>Bg`^/,Y V;jhzE 'l׋qppzDŽny&t E I3HeqE ^˲6_ny.&r\)j>Ş&ȽG`*oq?P1\Ύ;8Ǿ;("jbZXXjf +$w?[-Ikr2 IJ,iPhf=\(Z} uJI}B'lcRs D$I;\s%vn<ƼW}1b6xFXgQXØ7[ɵ5x|b&i2v=sD쁗T+4}SṼmM^*yPs(+D5W *?PU1*[)>)!P޳bBHV?J~Z)8̯_eBTe2I I< ׊sU :AjDXH2!QDFQe |Pc$b\H0x~N~Z YN|R)8*#rG_Dۏ:Vu/]sYzʝN/2C=w!<&D!6 zέ z@c~c_PhfNᘲÔuԘ;?+Oy]R0?gt+E렛\?zouvFOuQ˝[7WOL2?EO.e'1# yhAl43I9C<;rrȰ?>??}u|yoOĝ{3v)4viVBTF󾚰넧§fD)a) )lrS?'h|K1-,!q7oRIYs EQ<:ek$s{:O_9;6,>)GCGUl|Ο$k:|ʽQF'ezhHgֻ5Qwӫ+3~b4Gt7+P.ݎ~<{k&6e*d0w`ngԁU}9Cgi$mw׋/-wҋ_h6|wg׫In0&'0c2:=n!R ;rn ·aF I_5c dLmCҶ)Юu|={7W9 w,_/ddNq!c~*m#Y)ʳT}2IsI)6$3Jj-8!?9mڋZe1A<|$BD8cI aORGሃ2ǭ3X7>%(Q}:En?tF9> :Z .cfRlM084FRE nK[6LCD|RС V(Wl֚i*[6.f4[^'_] twg'>VGs*"S׷6-W 0^^~`F ;{n&,a /%ړpؑh;j$ +2RbEشȃt P>f'. 8fcZI_-؁ځoNH8uevEx|;EzEyL&MܱF FƷs,2#DЀqQkΠ3_e Zq[9RJG0a!" 
DFhau"k}|Ed7yP4ofرfu+q0[M8-QZu)dBS5ފ:lU%(p}2Vn}5B)6AseA̤m3&2CLfqY?0cͶ!X2ye)&;9hӌ{^tBT6+q5;tN6.׶ ejH8l^N!n0'c̤:{t[.+Ba6~N}}c~MgUQ Qtj3MqC/y#TYG*󔟠qֶֺ1SYPS ;z!>pB"*8qyV 'RcJix=f'q9▝qߕ l \ӎrfDy:r G' J"C+0 g0Ûl1Ȋ:-0u3KpsLmF§-G|ƥ${4Y&;fCdHb9OnGxd,'"[mZp^S. Iqm#Q#̹A;}/ҏA$}`G`-"2C mZX~<*jړF:C:[ӋfiVtrEv5;%4`X%uFm4 D+2g?'>MS-:opeA?nfI3@=ٞY;|̼xBڌѵ:|MIQ%HTUْݭhIEeyo$US8!&ӓW;EZ>3f1a_VcKZbO 3"/iak3S|I /cxގ )9ee1J#Vk~z.rZLH1:qIz|lϚLfqI!rAC9CBO&eV^|e,+2:r 9є ǀ"8. :3dЫ*h_~ȁ0rݺ5gTeϨʞQ=חj>n%EIٰN#U|E7te=hP%} n`Q v _|p)1ɩyd%O&7pȑ)FYg|~?yXD.EX97=S1T _&"׿>=X1V8&? 7wK|3Q9S@Y>~zrq'Z 3q29ww]n՞TS 9CS|bL) '%_^R[y`mȯ%s=%Hz'x6mk4[cxOB`_޲)R1?=bس| y 6&D֡:hZ:gZ1 8:8ZQT$[∿zE}_b6ɶ]\q.+7{}=(<G_dz|e뭌ӮJH!.n$+y$%9XY3̑j q/[ F׹cyOYNOVD91+w閦ˢM"yw}O?3w> Φ@VM9d_$sѳ6gB!Xw/krUyUyUyU9Tqպ | r-()uqF#5\FFӋ䈥@6 $>!3n8Oki-â &WUL2Im9O$|2EA xk=x%% SNi_k#% 3%0@mZ i!A?4;^PHBhEo<}Rw'9טhVK 77h_, ͣ;'/4_HGl_rdԄ x>\W?>4??> e[Mi"bzc[T(Gors7#40?&!uЈ(΅7(V\d2 !PkR2JVLu=OԤ6*Sr4 ͎C. ]s߽A^YvLmؘ^IhFM>4xlr3bNqv LN%F6<=Y?rbXs2')66EEa '1Y5E%݁E5WrU yKF~B2i(>l?Km6r'plT0C1-rGAqCSqOVS4tćy9.;nS>c6P\o/JXnsў&3UyFm9kETNqkS98 *Qdr@ fd)Hю?ib~xI4jR{gh#+HTpjkM^ p0I^mSy2t֐ytONIE:#0"y\IBEgX"ed]\њdH²izDC?!ί. Qg*錯 L"ͺ|"erR[Em%#쵕HQ|!@$Z5M 6LŦ(T6u5i$nS2R6u>m*A`Me'XjeZY0dm_nSQq`AQ)/afIlG 9ßYI:G'糐 骔vj{uVi5.D +)yPO5|xj"6/%RpӐdaVԹ@DZ?^H{Jdq>[ `#!_PWr<Q c't;P"-8V){,^̙ͶS&FhnHEP$%U{+GތE ] ;j(׾!  2謁{D4PH|o`Gɕ<*oƳD):o̾^Pݝ/_Cif=yS1L6Hʤؔ1Ma(ؽǕmAɸfJbIiO6R0e1酭ev4l)}6.~ E ]L = 4Z~sK/ tE#9p;[29ũ"\5yrl4LSͿ"!j47/vZ.`Dw^mCDZ4Nm2Lk|ueXO |sOu_XYI^Y./_|tG[x{zxy9ǴR7rry*;/}($^C~`Z<{W=Hkbڟ޻ۻK+T)ǖ |E]BZ tX -ͭXıa7K2*ɤfЦkUz-]-}Fq*Cvϼ_oi\Ɏ ^` 111lL2* ܕrZe.IK~W|M6m땝vyowtuǞcųUJJ{QD<j/9dq. 
CPoH#+.E B3䃁pd ~ϡvJ3v״3v0Z >đBj_4 !)3+ km m/i>ϳ!dnA 9 XUP+鍬]``Dk@jb51Pچs>ggN rQBA&֫B#3rE-}ҟ kko^>_5WF_/@g@<)cOIR[_uR1cp}8-9[-/w_<9<]Ɨ㭘llk "]<2c@k}vk~SqGK1E<-x|ć1'4HogfF?KF6b;b,~av7o^k(Ơ؆ssW tzlZA)(!i+*T^y5@dfYٝqHEELFZhHB I:ɦD37EreeuBW_)%[u E"^:jv΢H(YӲt]S$7UL>eh^(`.} T !o\E{Vvr 4Zw;?LlYaE#]nmB@00{Fk' "<{O_@j8h4E}9 f޷[7/r n|[c?[!Gƞ.ْC\/n(_d1\&J38wѠm2hPhz?ج$k셾*yge@jXxM:czpEZkNwYs[`(#ʑU_va6ʸ-u;Pżyryzrusy0'ryǻ9;=p>t|5%#lt Ǽ8P|5#-9S:Ƃ絒duYɠi,F8WuH44ѽvwSwB#&5! ]SrU?Wr 1S)ɦ[U.@4:zVHc` ֒x=HQE%y%"Rj<8nh#cͽ)}uQhS;΍͕SfB'G ( |-{ZsiPk7LIhiJЍ@5j,fx?&Sk1g"N`,rßW50$0~jN0EXM6R2*E>p a ʂp5JeJQn+9~QO1eBO[ ὸ뜢J0GWRog2'~d{=mԭ2bDs]g 2?XxzɡSVuɡSp)ܢCqu(YvOG/] ٽ"uC yÜ9) pP@M΀s9l#{>K}gTܟZվygo`v39]#|>)ݸsD:qٻ޶,W4:u4E'`L{AhۖQߢ$ԅbQd%?ıMN}_ΥIĆ)b%VK-'y8C&A<O},EֲԸw :ǥ'ivғsbjR.?z81 lP )į.B(! HTc;69^Rԃ]c{ߛeBW8//NryBw 9` F0t]%)Dd=~iK ~svQA[*b^ƞ3?Z<ߖscP>feBx JTܸH!Į@R9GRob- |K1M-c&m$t&a W@QG\N6ꙷ/n;rbB\ٷEw?ܟϿvsI"<%<4X jofh4J)Rt#1Gwj@KT΄Q%ԫ&쉯y/|H[{oۓM[=Zڒ<|͏,y W!k_>l]lzf:2dfFȿBd3* cSTzRPٟE.0%{xbߡ 0ҌGD:C%0F-fL4aLc\&R45Ƅi8`*xB %MrpHHB^&Y1?vG)Al >eJ0펒,WP_12TDN)sq;I T O!E̹ApGrO%mr2m΢=NA W#g(rkd%kF A4)3$3¿̛ ä;7SRԒVзcTV%ooE^J٪ ]:UzG+wFZn=%ZW֢CFx^:`0[_ZD JT' kaB󶫕\)GUa'vˮ,n 1o&7G~^ʏnK)m!jץ+]9_SxDe,h*,wp j] vD it*$D"M4pDEUh>"UmUhkHBHcIOGPp}bt\Bˁ'F' #U5@OJaw^l/~ ~fLzku'zk'Đ?Stm1s=Iz:(8ccNpѣΠN = KRHp9hGasi ^ =M -}jRLk r2Ә:CMAҶXF^ȁCh0pĐ.i@5İxד^۫W|z۫gnܑt66$ ) &O&b#1bve/w /p_^I)3@$ۗ/MŌv7,ZۿieT,OVSP4zMgϧQ^ f<՛G5{s;oVMj b%i,~Є4 @2N(#s@PY%G{mvݩK'C "yTjc Ji"?'KTS&'2x*CQ%M4r '(,&ANhZJS% 7ҝ+@|Z0>cʇ:d:_n>$wXȶE5d~2` i&BARIb#*Rh5ѐ#G{w|7$pz`=γo;Al +*! 
&QZ3BW d(PV,!N gLPhВ8?ńc&@I `!O1ѸwGwNJ 줶Bp^ /N?-I0tl>Zv,bU)*Zmr鐳IRƳ>ZnZO337{t$ziSaXo| ݸfq&%mD(@xS1y;dA!$EhK.DZ"麎@9V-ѰM%ɴMl7^;`l $I9{4-#o!֧q:c82- nb?WUbC^)`ĕb9>[/e(sdU1|?;5{7}fnQ͖wPy%a.a=d-Gף73{PhWul( f:^Js+ =\F[:s_;gPjn4RtTBQmǛӝ_6bCw!*orz3sO@.icА 8~VܡCthlHQ# F>, Z$ѳYsY7~hơC4cףYc: [FND}ez{QhշPҺ00q|QQిi l'&Tr o^:o=ۏf%DZ:B|B랱UݓD}'DIhK{T;x/bN+$9~~~5ʦ3^"IN ]g+g|!H5RgyB3FQpocɉbf7F:b{KL.xaS?g˔L>o"?Gѓϫay˳zFH z?&yS?I:'9䟔R~~jGy G]kWӄ#*:fbC.cg@J i0qFqFg3?iĎ'7/)FMaqsDSx03jEk~0~7_9vF~mƏ~dO0ҖٻF$WΖ:Aw`1l'ȳE[W۞xHEUQbYl4t"32I+#VhUA` 1u;}]l2eO CTʊJd]7!c%-[d9z[chy4,WVXtBͲ6AKym`@Q9CC%U2`(A$Xn&5U6embq}Ӑ=qG =q<lz]JJJ٫ߵ4Amv}|1z.&zæk$ouV\b9 κz9c/&2d ]95rr`-C9X_>|xJ0Mb& 6~\DZdp%:ŧeF5^llb8da șo^$y=%x}(:>%JME+>ԪlhY;M(R/!kۯlNՍ/牿 Dh8,F4oƌu%fsU]ָQ'yNuQU |,DRߗ)Iru}ϯ|Bt'K_Ϋw !^b!YMݩwhݹdze+9w!g9tag̖jrNZ*oYFe|{P{^3M2l5BWEL ТRS<<9^>zFj[b4sH?jf?"~5#AQ@1 ypE.7sbN\ Ǽ2£CŵQ7o橍һ:b^b{kt`y{4ƽ{tv96S#Q3O6w[ƹz>k_0,7u]\5oo 豊P i۶7ekv.y9]l]#]=o<3Vl!N5r#}+#1-pW0p.v`RaH0~HFՏ۰Sϛk}J/<'W5|pˬȣA]2?K\]_ .'W߾qP<޲+NO~uwgc`n-h7}Z/Ľs~0PMzنjjfx⎂с5^;0=اR" zmJdnU(Qtמ]Dgw}6[/mdkmnV-ZĹ϶W VW5߸5 pKjuQhN&a"Ծl`j]hmg\NQ^l7&l6krun?@]sèx}#K#_HbT ܭWBXڪ?^6b[{g#w:BJD/ n OlXq08e/cXweTyL%}bXҁQ%&]@v˚6摑~E,sjm/uL^_KQFp~+Ib?O+IHXPVAUDY`5Jǝ U"Nʘ{9ȝΛщ dv5~#MS}X 56OO7[0yhl}nQci \. Ne_zkYr3iۻPɇJ2z),MyK@o^4C3;-+Uj8[~㿹VRطEqopd(B"aFQVuq8#0-vk,_JZl >Fz҄VX醣,ti5Md_qhH#lG_UT4d^L;PEhK%%R2/5yBMmRro:Nq]G_Ho&؝Cټpj \jhx`^Lٗ7lsmEo)"B®> <~}?OǷv 7>> wWp\CtS+höG>>&~/RYg,.Ǚ4NPΗN* Ks|Yx`& %p$C!(yJtʐez#g&|-[6W Ua*C& e5HHl!Fn6j\=t!YAe+8ߴm+^N~IJYJ y Aَ֕>WRsXAVٓ_ sɌEGА] lT; xf[!=s\I3:x5#mjor#䎼m6?.(E?wsu44H)rcVl=? H~{ݯIGͯT֧l}:*[[^ *`F(wX+" ^1h),FI :IicyOs?ǃ#ϵx@>5$D&'(4NS_dz8mXr#@[l|M){E535 J4 J0\:%ʣш3\ h)M0J Y/UP굮׵Q+k{gۇRdGTGToXmA墯["b:fc>!rɪ$jndiHF]PZGۛ?;)ƆK[0J^;m~T*QGnܕ4qDƳ1,)nR HF@Ԧh*XYg>+l6Fl͟m:7QDm)JA t7mCdQ&V8*-lmC蚌-6ﶂNC5YjhDe^7Ď;r6W~yz޼Zd=H2/n ! 
7}np-SDɺ'ڥo>Mf'">qy=?ı;ՀgB`r0'zh8*fP bc& _ 1DAa/xŁA7#ڇKԨ LbT+i!rʪ9kĔ%iQ[ =>[_9NN;B`!Mm;ɥG ΀EN72yf.A8ٌ9rƽ`ZYTǩ-P0R1Ѳ /4YeoD2䊲 vڪ\!1__q5l>7Nr$̷뛩msŧ{(xJsGn{?([-ϯNiO^Ru5nUw'CzUgtxz3@}}qA/XR(_|zp~gƵ=Loo_55HFY"`=Ǎ["?SHȢR}"3kA 8SR#-wo J ZVZ;/|Qd(TU""~ „;< DD4(y#(i 1/Lv mC|1 ?o,(1o76-ts ,%4sb@D1{%%L!sςwRB1qQIPDMj&.HJ]Q'B#g/c:(c5Jϟ­@j@-m-A1ML*Km)[f wQ-G%!-BHYb2Ĕaa(< `-e^<%[jwbs@h1}f l\ID Rm5" -cd}I2Ĭ<*bDM8$KjD@u+$ I>K[y:Ҍizf./(Z7w:tϘ#cdaO-Gʖn8`C:>qtAA`/"1\vghи֯$h L޵U XvV$]Տ'"wXm%%L$ xAD4FJ#+r)r"i5WnV;֩ ((!d1(2fULeև&+Fb+,0`Z -=cs/4E JY;k) YE 25Rt@hӛ:;Zj478fvVbuqujF,Yם0$[5-gڭDF[7(ͱ0Zr~Dh MU"ŐdE)h.v(NXhDf9a{-wW {r ;VNg`{XxمdR2ؤM.|$ABL 4M*-Bly|ՒaO]\la2k EQn}y޾ 049aCұGnNܝPj척~cIVJVguӢX:j/6Q-+w1jg~Ңp8f5W}ڳ6Ũq8u)U|V ҆BWa]Tȓ '0JAw.]t[%)V䷍^X"ÅX 7!mhgOng(D+{A)jd}̿{?Y'k izοn7'y|impq4 ~s*,;˧򩸳~ؿ,&$#r]!sg(#e\:DF}`/˽j+ϻɝ{ipx9iF*-':'Fzq2?'s4'_[RGͽ* mB PĊbMi,ri,ʥ d cRiδUV(%2(!C0&"(om 5l;QvZ:Wc%*}|5ge8|Y՘SAtG* uDȂ ZJI@NZ7Exp|m5 _"` 4ZUp NT5_P8qT?ӡ&3F)8F`rwo:Xoȕ7ͧ󿻊~*bqfO)=tTq{{q?}fW7~'JjJ8sxPhr[Z q?7%'9'*~Aw̡֢\ u@nh9y6&uHfM0y'v<_ {-U(+/';7WA&/(:VQ ,PFJ}"| O.ae*TLH:Br%J7ޥnhuH :=u!R\Fŷ!lThfRf, xlb1I?3!g'H, K 4$ʙ\%'N"`[2*zsA?_20nv뿜!D} P XUɈESުc*WPT)RDe=(j(Z$e|L)h mr6Њ{F0I?kTO0'\\/X hBxrP'2#tԉhI9rPl[iAh׃qSuSǚ' midc)mMu 1{Rd06$wKe2gctIʓM9M:z[bΓ{WD N^'-U= ~W^#¥* k柛0&SmenF..ޟ0At>Ȳ,Wڛ?~y *x&hn6ۗC}k>XwR"ٻI0}^({cz6iIw[TJ )-(ن޽m5P<2_h-Bff  ) cdKCW EMR 㓑!q;(BrTc.D`"K`- znPR-h^.lZME"}ջ^O3:{?eZ򑣳6BPڊB1a׈ŧ;<軓'YNp2hG%ؓJ[MxPhRF[%"hLE%-Wc@i׏N.&T4i J[f8͖vx:ŇENނVI[uTL -3>>Z/\$A%eYp2{bX' Y_$" HP @CW_K5I5=& Ipـr1"{I9h\{dWPޱ(% hs`L LD/0fAar 8(e!_i-%ix9bCek&h:Лʥ58[֌fLўمE8c +Y3{6װ|z777jkGOKda=F8匿G U (p+_߿XCی @V6-e EZ,|ٚ?gDK;Ѧy&[nIaQ+m,ò5[hBn6\RS!׊+'a>z|-h@JIQӁ;ť_=M =.FutNpX$a_UiMkmadZ>:#mݘ]etlx[-,Z{uզ'~&`dNvE!vh^h~!+Sy2k/]hIg Eq.ط|=GW1a#/XHbeT wfOIC8$HDԔ8 -@fmʅtTg_PQ5#H"H޿ڼP_\uc,7z7b|?k/uO5q&(yr]ko+?==e}W@?I'm8A,m68F;KB]hQ")Sԕ33{HQr8ir1`1; sc?k=œW+ 骡P0d i;g)$ahXGY-S)t |flkpAa:0 h msSi$x 'wm0S\!&:NPe2 )a 182; <ЗW'!Ђ+P3 1f#jMgD"p.ً 5$)] PR[ɰ%%ҁ VwVmȥĵPf+!f02G.ЎT'[w'EݑkF,y8P2F̳%W <Ϧ*xTXI|Ct)RȦU%T3+Vi=SȮ|6b4KkxQ\IQ&UVZrzF 
@\Qy(C k _2m߭c>pHc +`N24aT!ApEpp2_7FX/L]thH>uK` ]]Lk`Av}'9DŽI.GD g]؅<;B#WY;'TzK:BvJj;Z ZGB%$ LgNg|wY!;Xض)f yrt\-nւw;X>cZ馧AFW_> ZQy7F:u-1QBћc#Ht|MA [fwT3IbACFloGZ"zq}:wEIN}<}e#{VLK.^LmbAb-xYJJQm6́=(OiHe#Jt^\Tt^x OL:.~v;J8BkUs: " /tcE0EFߠ:GkT!zTV T^\JÅHhKYhO/50cs4T@UTL)XIEZqT"BK\Yƹkk%P͙}*&+Vvʽ7_/֮no? ;u]qx\MQ?<P_ݷOUoL/W|3կ>n| AVIEjTN嚟q _Jl0IW^,[*;:>HJxD|5޽kUL` -OήFCKRB m|vCLn,n]p.nT];_Nʉ[G?\dw o(,v76}[c`(Ō$ry:.Oa%+&kZ\L(+L-`Kd8orXBWO8®[Be^ߖy HQJrj:,'vXw@ZQ7wwu{jApȴ k bB9PyyJRenh ("=JV5i.c(OL;[}G?dT7*$j $(#TLij>CJB4\2b:dͬ(lT7l'mZvҦEg \WR:m+bSظEGXSLrjU  Zb)h[Њ0@՘bj ģrmsP[DB}L&JfS&p+hEaR4nRFGQ *NVS0(1Ђxs*-y#Tۚx ]7{KxUS =^6#? ݣ&Iƀc^ y KD_4xmEi܋:%WڳvT{` FbǑ$005 "8^JzhtK+p>qN|*RgM/ |N+7R HUr=:VVK|w WRoAΆlM*X~,?7ؑkfrdJ 9& Rn$\_GOͫXF]1|}t~c,R{Sa{Oͷ" Hg5;DL0'<} HBqw8执jpA3S5U4}sOUk0lhdnY/1FQiQxf\>]ҏHv \%ui qݠNx@8,0/l'>_w:h+8h}ЭWaG 1qݭSXV?#v04A/?^]I~n/΃mouߠx%QKV&\_,3yڝaba̳p\w*ȦͲ焰}l"/O݉=hٍ,"[Hy:^I]X1vs6pB]|#S"3zl( +/t 2͖mY%˷$RV"Bpe}|[1I%A )uQp+3Jn (m<˞B*޼] B4QB Gqĝ+:Nq,nTA!8xJ'ph&}U!:r #A9BRRkPNr^ZZ U0AaaXI)݉r@ /e;Ht;GQxyYZ4X)t.D&N YnpA2nJ1LقenrCԇ/B64p2Ts 4g@_Y;mA{ ^R}̱:ˠz1^QHDpr۾EK%93۾E"|(ym@ĵn6.sBFD_\ wAq%!jxcosQloͯ4T/'qɧ/2'|^/r32#anca.%Lh\+#˔%+0n%P# >7ϛ;nVm皊mf̤y5:mMw*3s[DlI=..YiC  BQX˕f ˵4޾v  *\c(d#PP nhK^ȹTr dZ(qޒ)m- EEkME" D`P2w3ICgKqv{ 18!u*07e)Є\+1ZIՈQ`IzBIFŜm$v\rU4X:ÌumYr<:\`7!(lBw\< 5D%@8࢜D됛Vɽ_1oѽdN cd9'E/}W{yg:%"Y^QX%L41ypaKsz_] ʉ*Z*K0vDEJ%.sCɹD0 )6XYQXRh˜6@z[Z㺿V Zݎ:PVS¸R7ղsbtlsNJT;w[BB)jY 4/0f܂AgR678Ľ2}IN3ܬL]`"BUji.UlVfr2#dmxW0~4IV& ={!;FMF:܈Ze\*+Jc2e"pd6;fRPSBܠO|D2/ uV,3 2˴s.9&ZpvjHJqnuf;voYh|I)50cs4S64T;|^Yj_%{~fɃ9 BR, ec<`/!8;\gٕ;g2je3|}Qўt[Oqh!š=ֈX(`M0:'̴y>sJc>+NE4#ۍ-C97Fܽ.wSYn"68ΐW@;Cr}KP^:`\%²3\ge#&G2@}Th 1hYoS)J8A([+4 m !yǼ/%J×h55_Vn4bT!8Ґ=#$wS{LU!}^krDr6M'K^6M&dc,wm5}? 
AhqpRrDR ΒDɒMdJD$c9͉}" OcNC@<FQRiW}xiRI=5 F \ x=88MrQ uҨr"DY0<Ж c %$A@/}Qܖ&"(Wn]哹=%kN69ÂK9Gm9:4̸mB\3P+cr*ZIn󜹵]Meyk ?8 i4s%it'%*kB{BR$s4U6ߗD-|_]A(3BP%Dgj|HYτK iE(@hӖr EZKc|@Z"&.C;{?HVVDR*.LPDo8aL oH3uOhAPi*wu._ӏp=Ô^mRJd FK|pwֆ0R2q:MDffpغx ~?IxQ}*q  Tn{~\8p~qfȮX1XD:YRdڞR3ÞT3w=lGQX-u L|,sv|y3%D1 hG8}9qRXeSझ I:-l?pJe9 jI=:a׍I Q91ԤGu Mb+br~^tBV^gN&tA^\z| L v.y$I䍓țDfi&$$PaFS!&xQDQ1k5 [ּIo֕bĨ]RCV! ؉eqqgvs'M?EY Fr[l?)k6c=RWXX&0˜!a V!#xˤ LdRi' by}A!mRoxoF^X COpwkM-TIJZ? 嫡`v96"*{A,HH5Xi XPې""!#AYș!&c%i|ƚRr7YQ1 9(&RS喃@#Un2Tp'1;H˥ G"` 6ic8g ,UcERk#cX j\T'7fG-k\ha7d4͟nԛ Ux%E=p—ϯy['Tz>bjGoC=|89oD~J&߽h.RZN hxV`+*4̕M4AHGP.Y$)?Λ$@j9NovFt{a✘胧a%ZX„ѯ1Ww~cjD+(t?BSjfeڀN13:!S k,`(OWsShAiGE8XWA] b0k܎>K%W/@ t:Qu*Uś˟j:,bfyy+~5eJ. NmBpE܅ MmQKjR [Q{v2\KpP8I0v(w~Iu&0V=ԡZ'[u[sA@۳8f~9eTT=w=DZ@ ^Sb?+۠ZZtF .a!V]GlX3yASS-(RO"QTy0z˃cvAlycy2hqRNĤyd'k3tq[2+c9mrn%Qe<86P<;OK"X"/%Cjz𶾡rM,U5RY?`g9VjV~:Fn 'ݪbPDt~v;NC&<*vXʑV D*c R5[ɴh*͑;Ւ-y<П^*MJ}EMKߠOR. ['Ïm Nyc:Rc> "aB_ry}+FEj Ԅssu8cG;+dz0´ Id&fOoJ՝MdUh''_mf/1"~v`Mpw T.gM~^XNj4+IQ NnDJ׌iRTƒ\Yv‚CoUSJڑMl[VDC*"Τ0$i#Ĵ1P2X+f5XFKh3-ӏ=UY„#cS$|B3)?d6(@(3R,PıFjBx{PW!(t{?Y0\?2_ŞeLO/3!r_s.(a?f4z54=3Nx"oO `nt *5*=e:Ò_}z_rg 佺p8?^L??ӏwj &#\n|vv5w)\{__Ժ`חazϒ`$߿~r|;Kd tr_,s~|G^b{rX{];3]bNm=(N2%/,h~9a5ԞuofRi,xUh7˱迋 sBWw@'rr@RC YlW;M"Q%h;Y+؟Lઽ};͒5&Q{_Yx~C #Bu'x!~z(jMit29쮮EyG*s S,"0bFFQ%J3f-|#"qB4xs& lLz87.L8c6DuDOIìe[<'/! 
o^^/^=gS|~E.%I:.6y pr&/'Z^@OYƂWiXcJ'(`-RH~ Wew}vr9W TSo r Vc)6S#PR(.^iJ1œJzB\+&skGY|)>$g}s f5F'j"jbv{,xr3 \LkZ ГEg(lv1qn]ɰZ33iHFg .XL{@ NZUJt]>NV%\vmJHX0Ƈŏ¯)9aV繡!ä[ƹp~ [-c΀st3ޓ/Lm̗i6R.z)X >p!gD(Q=+Oc5秅љRP+fp⎪7ƣ%߯~ (=__SNZ5wpf;MdQI,TeQ|M)f$Xҵs&\L͟!L1LZsi˫8Y原3Aͨ*˵FT1o꥝6]-}u0+x'TuڀzsK*` T <.b/N++@yp]LQZV+i;Lgq̹,+uE- WAL˧qri\I%]!; ], tp$Q2$=N@ {'8j̩DTl`.- L?r^Ӏʺ j#'ֈa0V9&1R\ꀐ~P-3%s)b=:)v0)o \ϩ۾Id9]!":RTiNXsĒ am42y ?gY+"̵@(l C-V፨^S`@(&3l4 3ȩH9X9!$Xըz_6 p_KsdŸf\ɂ4}d݋4xPr`+ {'b (r=|xu=eؕ&uS) @92I6 Syq0`4\G%N1/d"FFJ\929kb3*Ӊ⿐}%l [V oF?Mbp5s˼Wz,rˀ(F{m]_sE+X=[: lOJiQC=h% seq*l.yna_4͟{4ֻHc/Ⱦ߬Q#v^e=n1@W7$=c*D*)nHm'B Bu;H+ɺ$$%;6J-~?5-%&NJ:or$؆wRe 5X7/, JRk1[-Y-45TԩVtfݛf2XN5`usRx/&-:,Q®&wxSQT+S5Djw:(HJi eP/ujU |_+NbCx<օpBiXT2458i)cP-*J[{_/< 鎝##fzӲN:W+FwekdFޒ{ ǹhA"ࣝϔZ8&r\f+}Y+G8A"J!%rj_[v+|(vq:K%"6[^$-8oAQ8Sر¨v+J@_/*]*~[_0tY bO{]1?v#JCj9%㜋KF&ǴZ/Ab_b5j\\ďw^np>Hjo0g 5sQ?W5p3L3b̥ J+!XhfU@P A$ ?ףOjE]]jE4$Ox_ E⋫Ml'1FTZO-Ji%qerp+p9AT S'ßH(|5G lV~')cg^[k94+D<@4go}[qw (UpVDj,RWqK9 %_<$yp"ΔTl)l ge}V5;74:k(MՄ+ҵT0HI `yrqMI$Ol>E%b!=d6Mq}̲Tھ}s*m=o`bK2\ÝbrN5SZaLjG^ ieoDŨN}FM:5Gf {_6Q`ct4(IwA'5wm#J2,p)]TԩƬҏ?8I<"=P}sRӴ1D$: -EMjLŖBb"׫1hߠ"c8d??rV.BbW%8+B:> Ve#3t\฼P"deMH &_jFaȇYO ?&Oqg4NջObW3fhGd/r(^Az3;fJ2f[Wl:]G9l?6'[_#W-` N:Ϩ%(,'>1 ÙV8c}hVgf'|~icDza2RE6Ɇ*$C㘊;H0# ͦz&67A'SoPPkYNBsA.%JD;mOjz'PC:U#/ILzD-%c${"Ik~OD{i9%FZΌu#TX oܚKZ;~TN{"M!onh"t?E) Y ˘O`skAlypw=*eYi\֑%=ZYIg# F$΋],Mf 8">7F`otn"W'qRلQ ZP3 Rێz3k-n@0`_ ֛Timc8˕A4 ǿaWƉKi٘=^j ;̆% E \*;O҂j EKQKJRƘS&T"gBZ{D$xFXҌt ǵ4ԨVZ$ZP'hbSqiVqk%*+*+Մ]TԩVqRrn=-HN5JR@-R@iZZP !NhYk)iZJΝ?ՌKkJR5Fr_[:hN L~1a5ӹ؛:Vt1Y!~ &9@N<as-a΃pwW+[F6@Icuf(Qxiֲrl7++Xma=W׍Kq58YL޾1i'4UVĢ\EF={Ep: goAD)Ip_s.h,`oG.\LSJuY#ȁ&.^\{ De{7k}{74 E*,tt &P4Ƙq "[䗠) GJzAa\suXdgʸJF(@\"MrnVIbJ|nĻqgDI%<n fi!ϝ$kUc:KMZ-jof5*'ٳ׉} dvKO]~mf)=,&Zciw52!LcqCY %rNKob2$rȞ*%%[ ;UAB;&m#cʠ2v`њF7`4:WăfJc&oe5x")ns[j)L0*ipsLh ^h2׬  {Ma,q"+ȏw.}sMQHBX*9 fmL1 1E_{_$e)5.N S hq/Q,l>5ĺ;ZT!'Q+B KEy׊ؖ9Z:VM~Qi*ﯞ8}jKx4`>bpHv"e޶Xbq9$ٶHg=(c5cjvB{zɸ\S>]nǍg&_T`|1V4\sf6ɦ޵?~*|.'h~({:'R0 )"-RRLF'ITyk .:ïH%ĴuA6WMd5BCozu?s 
KCpOQr&9aZsaZ38Obbz?P- ?ݔMNFӤSMiґ Bn9 9O`$*iE뒨éANߖԩf b#أeF~$<\Mt7-a~oƟM>ٲrKWÀ?hNvŏ–h8d*ߥ%_{in\gO^b^lnQey֑h+:tzͺqR~L$iFx S"n%S[_tq[gY{ȑ+,ˀ?f 6;/&q&~$ef1Ȗlڒ~K .VzLPuA t}G6!dҭ{ItCqTb89U5v?)+lc q3/LĢamRX@z+~v>vDIR*D) I-<'.ӻ ^`;f<gWiI.6ԩx^Pٍ`(Z,3y^1A dpڤs} 9UhRVؓԂrh/9 i%Ԕơui\Je7%x\Pj̉KKJRVD8 YTPSZI>qQrNRVr qT8SGMA.F xo S-8T9%y j],pg'曕P4j?'IKsP :Zr\ "K!8BJb\P+O\z\ KYLI 2vJ.eƥ%P0?q1riH\jC)y"|΢1lՂ@P TJ/A er%DK49BEh aՆԈT\9 ^q Dv#/﮷oEb)ZR ,몋m!Bׂ)ZW24ʼnJ&#]Kc,6-帅(}i-.= jfErd:49Zٞwf%*um; TZW}ˉFh!lmZц3D-ɕ 0 )\X[Ǜz1_ы;'ˤKޏͬ_|M#~1*nVW@8UpG]a eN΄ ((R,'lhp%*osbߘR]_t>`B=Ue\痷f)gz޾MHY) ]6%cҤT$acjmy /XͤS_ZEw)8 Hp`LhWY\qvogt+K+̄{t%O {R [Y2Yfk57ڣ\pUi5xeI{$&WTMeIé%k0z,iD؝F@%&%É -h-&HԔA+K:ȷNdlJ?` @X%{5鹲  P erkL˕%)+,x=d:S8tkr_ *C~|E)P1-B*|z6t;n.:yl$2bEm: QxC& -lV8ƒ+e&˰fwMދۗوj<{DӿDtbI@Ɔjxj4@'֊b'yՔ6R%Nr*('Ԝ\ N:ɥF49鉉oD5v"'=^.D0zI`HazkL#HMaZ3xO mT^xmp]islΜݟ+~'%v4) wv?:PGDL{G綐^0srci,uN 6w`@c*`RLb >LnYg םm^|oRm]8X.v1:=QSlفN܁#%R$}2[ewT-\Xӊ~.={hp8 rQn!˅UJE2 ȼ18Ect=>dUR~*ir1$콀=jWuCC/~NM"Ӟw?UƊ(ӌ2.%g|{&P@"*㌰4KBpD[ TN4 j=P kv6*6e` A(@ƾig$uwX[eE5W =xykT$.￯ \].뒹Z]u6WwcXְѩ5Pz85eFbO$Tn\5k!JwF(fРHк,!GIA<`2o,Q0k嵷-,ųJFXԃENF.kCo/wrt˗PkBsXkuWƪs/v9ј `w/(^Du222* 01:xU[lU4i~8z20S6QwעtC*uNLPs:.^hmobk[Ve6͍{m {Z۲+,Hi:pj:ٜrN Be {]9($ʛ\`˵Zr!'`15ac+"|_o@+--q!A@"醘!( x"/}?}h)ٯ})SsO;h̞mV#^"KBnگ _W*fÚ{֬͹Kr;CqSmD=L!xYJ8Hah"ݦ"M+CK׉ҭY OʈC5I7* J8Hah"ݦszMѤ[vJҭYSI1i?s1i5Քv&sRYOnygWY\dὋ, %оCk%xmn: < ؿ?V>V\6Gk,X=;`?r~Rޏz]!e}O0uph I"@LNz>a'P:IoaI*Wm=U=%t"J@QF!K/«h4Qz^aj}Id*OQ9)OJ:EUi]V 㸃*OD)7}@ѕw31hw؄d[)o 5.&<-Xwy<85έ oYwmPf/:>Mp'>Nu?|+od6֞eP֢*G6TA9hD,eƳ=x(Zx 휜0(.G;uQ+1J'hӊ6(d8A 9Da.]z Mx;;4'SY+[ڡt(/<3́A%043`-XUaX,vhM# Trz xdLF2e)M<]=@1Fhž(8YȻ#:|H9HjZv0}+T2 d2 n.mz;\0xf{$&WoM_é՛k0zlDQޫF=,M!G~ w˥),!OjS6&ПO&ɲft]VϽ/; i @9Wb`pNSy4T:F2Te" I n4ʓžTHt\ܻ.ckwChmTgQe/EA 5kYiugCHs=Zi`_V@1HM xUłRc6]Q%.! fnxŰmZ)8CM 4`4xs~ 8yiiX wvĖE U9PP[#eqR k$N2Idh R)$%g=tXT.h@0Ne:IDjp 9ډ7 愡NV *+-Q$?{q Ie} v<B_wuL]LR俟RCs! WⰧꮪiڊfqb :# ")e@[XL3B0vLТrNlE5Pس3 H<w.Q7IRF\kτLIX Х1BxQidl'tL3#|OrO뜗֖N8R!TJb +%bDSHD!k&\:C=X rduL>G+U\9;4w:=Qxg1! 
0c]êh!Gv z-d["N4է:rtO{-!9VV#]_M~"(̏e6+VOJW}7e\-2i&4,]R\&`=%^bʬFZ2ifLs!-[KwXCxA8uwPm+!+v{z̐RuK[es)#8n˸JH^sr1L%/񄚒$Q\⣢ehOT Ԣp6 ,uuڪx/'_{LA{r>h7Ȉ9\y'SRqF>(6h>,Cp:<g6}8;p\`k͊Gr@! Scc_y.JFKY;Ax!׈_On13w.TB~μmjH]8r 6Ske!?Kf=^f'VO.dڂ q"iɬn" +qfcD|>Sz.XZ tf-gS,T!F羐IKjEnl7 %_zp@;%/o!3ߏZ#T+ъqTzSQ%H}={c#VO5H!AgoSMQ8}M]1im7NOI27!!s$SLx-D~B6t.)Gn/ej>$w.dsWyڭ+9m)kMn32[;eJb.ؗk~`_tf-9.ا_ Ni 广Ӛ{ns=:LIp/֪aD,~ rQ=l}# nVSz ljIVLl5|hc>@h$dJ"\bnZYڷ3!Pk!#Do#֘XcwQ.WhC;#18 G$KA#6^qG1KA:2D"@6 "6(9R‘!HKd{Ey9?H0ȣ)Zv/XD*4`:BaROpC{5oX DŽȺ0cI 1~` y.ޯ_A e?Wz 7 KcQ~ڒ,im ;ɩNwO4y-U&jj£%@Y̑b;7sƨ{|f~[E-ݺoȓ`<9;gfE5|R|\v={Q=hT}9c^/{t4p1yPo|ۛz }V|׏ ?sfb4ҳ@K<q/#I|}g -L,\/}nOc3wFY,>/}8c1w77WUyNU~UKU[ JrgFdq}L =Kn,Mv\k jK,3%/DϯZ~kN҃kCL&Lk9F'7ak8]zzt6KvLkc=S`ǪS5מ$Ĥ=.<,GU~ 8r9~cGoz;B;^hԀ`fVGts.ʊfecL{Lznmӆ\,8(`.N#DP,kҠ? 7{IbYfOكKb٨/oyԊmqO$7K>wh׵g1o44VKĒZv,9~ ;,Q O,iQͧq~?$/J.eG.~V 9:Zz߉ODr53 Ru Ϳ9C9kcij"L7SoJK6\>OINP5+΂[3dc-ԍ O, 媨НMNiv/-7g JKQN[ (;aQs_/ɒ+YVs[2[`W"lH)ELX(C)!8qc'qyX&Ԉ,iA2P-X‡6X52;Wrq(.J@\|U}Gʃ^zjxfXi˛?aS6Az֔(FO$T;E %3֔ b@'3+g$ZdrN1#Ť7BIu(+IDpb/BIi2(DD۱:F3QSADS2 y/AVEVYݪ$Q.rkUsRt*yinݗu]19پ*zz'n=[؃7/H"aUwP~TADuU'?$^(6}'(H7=y }J(J/&3XuU" v{8=X1Le?5J\ݑd ˟O2aSy"_2) ZD2YtgGn UxQJn yW85}ՆlG.b*; ]¼S^CauF軿/ǃ]zzWYxK;'XZ$GQDk^LSk<{7Z(4A쎞+Sa,> `7;1wYV*k zHhy&.߹NhΉl'Zy'=#fsf~хx!0$Dn@ߊ&p(~ւp*4Q X/=Bւ6DEпLZ5".>/!ߠI2: *-¿/z#k왉1RPRk`r!i.xB"( t4KA}3y?ߕ`ٌLa1a`ԪOm(K?_/?N}-͊)U3y\ ?\d(D!vRxQ.6|7 & n~{GVN)kp4CVh%E>H02'3]ի/»rg^Gd'} ĐR8;ʭ5!" 'c;Jq@hDEH`F;+=b4)'MS$$b w~amJ"SN$l$)IR߳]L j;a)SX*4Jp"» H\N"iX%#XKN;㩩Z "%M2Dok}T|}G PCa->" L ML 2K<޻XJG}aRhBZ\"l)*cmGoaWsFϴd7^%k]qbqr/ʁ]տSob1Q.~(a Iݴusxc\# €ma,: FLH474dㅙ(WmۺDAH"ˇ3x>Z RDGD5RmQ E@0r\wULiHӒtK{ 0o-H!;%UXꋤz r N"Bu Q"5 ))ZV΂h6 b Γ< ̟}e]F=]LA A` 2|Ns K@9jӜ-ab;`a9M~NMIzI(޵@u7Ud:?E$;Rb-}hq_x}HZS{FKdNPg;`49\Il9uA/gx}\==6Kr+J=gy}Lp1ذ9?&ygDOŅlqI)oA2].o@~uooBdCԽOi~jr 7.\f ɷ`w%=eQu-˴,o./1S7Jju1c*Sl>QZȷ8=,\2_v#(|ʮ^,o^T&]h9<]j  ɂ +.|t!q'ѵq? 
89 _LG$=s7;XN?GBY|-=>,:Ǩci闲&{'w}m: DI>.b&:c;I'~)'MNtrS`:M[K֏(RH%_vӌgJyS Z0ّ XA;#[8-gT\q(=//'j%M7C/n|7|jѯOc##'*pA ?C} 6#?jz2Z+(ȧnIxSEJTJI&)V"I7N:JC  -$  p$oE}$D%`MFDQA$DB}VQg^ҌƈW/hM/%^ѿ ^K*;c&W3%j.eH+ #oRE-F 5q5J.R~z}?XEGO^"U#D1L; # {\= V9v.ZnNϮsh !xOpΑN,MHmS>GӇOԇ=S"R%=3z@wju\o\O\v/c\v"TwH=oy 5i7?u/5spx1>~'Ksf*;TK)rY Ym*LZD|Ǐq-v|{ǍF`sMhǡ4LPĉR3780s4+>rjoȚfõ_"FNhJhPjF;xMS*yx#^Z(%$I4&aseQK^`bdU 1hE4!!=l$!Nnap5^_XL9 | XP.KL8hJJPFT ĮGH`J!NTaUP"ׇhOݿV۸V;1Y{vPړ1?i3k[ACHnTu;'ѣ4k O$*܇=' aOl ?~䲻cE9jݍ>S 7dUp(7 m4@c0v˷$sZj]u/.׻/WؼB gLS\e.K`0,Qx^$G w|:g~P:ڕlvl4n6SD^T)" 0o;p(|ǭٹ0Ƭ+P."bi3*Ӝv 9LY:<= pe7ޡ辛e0P\ҍzC_rZ^!-y43rl93NK5ΐRނɃy 5`hwJvBfkWO&ց3 3Q2"Е1O.f(:|*$ $|Pi0 (?]",.Vy(mEښ<|eIfP#ٷ,]MGˈxL̙3J i]=VJ((KǞR)CQ}?n(6k.7C8FV]e6ϛjt4\p1!s&t$y!a"u!} S1;גsXZtnZy̬k|ɚd| ]߹G*eǶ |}7~gs22JB-HDddQvś4L0F/8CnؖOZg XM{8$X {Ìn 3qiԘf¹m7 .f~N;og8."qQ'"b?TMo= J8W v p[ :``$,B&βH-8*%7s8wʌbNɲtps3ړ9yl^UgQ{1iP`FhFpyijy7HR4),k֊琬ޓDӵ|X?\p㰿fY>c*3WL997P8Jl<_Umjei)rvI\D!!3&sa]f6vf kH &U+1\Lur`dmVy]x  7Q6̿}uɮfuX]|"gX`k|FZ88t>o8jt-:żҎ6"y+u)}dt2( ŝ'j6QFE씥U"sFdds2bre `G@Kē(|  L8-,|B|;F{sp6tpmNijDWX v䱰(2# ^1'@;i쁶x4<>(}&7d\TB+~q(`@c; 5(;r $ْ58Լ6PŠ J?uΡr?=ON jJ* RyPG]2b^:ڑ3|dd/ x ?˫ 1mFMVGtlAB4ZC$ۂ73#2`M; dE JeqMR{EݨC11U 'ͯk3|P\ 3"SId"sլ'C)#cLsݯ M[|8rh<+:¤1 8WޓKEG\(Ȼ*n`\ΠRgwja(Ng ^dJL8r5n~1IqQob]xBq5mu^M <3&^m+ B\!E@X{ ^\LSBoD [Zڤ,$dtSZ3t|j$L k6ϨwV0 "^ݽG93Oϳ>NO}b#2{=xB)3U=NywWdYJD2əQ %pM۳jP\p.k]qҾz:> Ӕ.Z<"2|4b|5j?>lJN8 4#[$[K&) nwӀoę@G'j`{~(֎ ߡhf#FʇpyJ}5>dF rp5K f8P_ҭv^ r{$|xrAfCv@ޗ̣ҡ+Yȴ8v }< "YeԷnnl'z QSE{B"7,in&v&J[3c-ƍ3Lfvmȇ)6Xnp11}4Ҏv QoO |QYV2%Xjẗ~3vvުMkyVd瘊YP.02: ku(ML-:\/ٗ~#Zy1'eȒΒTE&WJڌ,z6&hǮn1HwX H aQdz?UvxR3Ex8Y5NW܇W4^pܤS'<\C;7K <)2DoP@6c\o+kbIq3lI] nYYbP;;cZI|eqFm:suVZjдn hoi4yp^Z$glk11yRwӟVU-769+F׌>}0&ds/zZ֟O"NT#SϮbU@͗Cq5dcY0<#肋nQh9qB;ٖJY5%hWR̙'R$!A#ɛ&B}v ;vVr|jcMZ\v'&䫬<߯?^-zYZ/yEmϩӠhm1/[̪fbas17ՍyEZ]-~w ͭuka}O// DA< ĂP ;׽G͝u-_#U5f8Jsa[N WνAbld!2lg;Ê3;>̍oW wACr[-"uO uqYka58*?}ړrzJ=i,7 ^ ;&Q ߪjի:\+7@lvы\]#X,J*%Ǯݓŋ"_'%_;VɁ5dEL k ؝fc>s01F}Y֨bT}@їp}[TFCoXz(9"kjY=ua5ޅ?|$6@֒h8, B?\xLceRH8UT;*BRYǤͳ ݭ Y 
r+zmaoF}l/m^٢D3͝ULѪiD!Dy)?\TLU7pz69L0V@ $hKM k+7K~?acڏޣZm ]e#dȉϯRtyuhղilAh[w35tfPc:HhV\VZ @]N#A,Gь֓_^uWz-U$C3YY,?z9rC?= 9>0 3TJeΓߺ"IXv̴J >Qo mcKΣv?v-XZeB R!g"|=|Fѷ6l;<ғ.R E+Qx**[-F 5cא~kJYg-ULNQ8>_@D:դ{nlދaڊR 9t|q^cX @. ZBDɔKb7N^;9{}mǔب$.ܙ`j^€VdV@;&|te/uZ'v֌9jë=ibDtM7-VZI+f~kZt{Lvr' ;_nlŔļN&qxMZRaGijUxR[])[\Jg׮ĪB biER( 5?~>xrO N&9Nh$u6js6;RzkX_~G&Qk}O'5bpaw*T~;s[쉪8Gk;;+ |*AjadWnho07}}/ c&/-|0Kѣml2ov"EZ~D^QlI,F"w7{٨?=7w$-~]Rj+iXuvI_qutb k/ Z1>z4jsnVulK(DOY"aC)[oez\hLVlQTi'h^Nꩦɳ˫Jgu<ͣ[$!=u;8/K<8;;WtqTkjȪzmD:q#kblk7ޢG-DuQt{8|޽K8{5^&5roofg')zGrQ1 MU^W]SrA3E㑿0[HG.oA䖍~Gx U)yZ;Si1D1i{uQXne˿'h2.yLXEpnJTA}즕z-.^51 N&A'4Nqx^5A%QaKf k}|F+@f6 S6UT3~>ֱdl X<9-=zf`3/r5ېF\>OGe;xyt0-U- #e|ԟi󗭥iv!tX|~$ﳋr E丳ZY׎\2$v& :W\ U6]XY Whit*ivGnһזXi G$tJIn|mJNo!Hs~>*8jii9-s>2,B-Nbf>ƀV[ m`>VcN(r&nGԑ"YG|>bRHIcu;-Dm,k -9rH䏲 6{B*yr|tЅPjGU+ー6E$!IE p Om '[(Z'QQDa*nGad#B/[ 1jZba;Qk+cw RڵƱA\̯mv4UFk3cYق\I^j 8xIvA:TɀTZ3^C*FAhzJ  °.'&(_ @A]kCë]BY5bB%t1֕dӎXY#܇ˡΎ|y$qQI6r ;9mlL֞]`Z̛A7[2ohvl]_&ԯvRk+ۮPɒe9 /]i,m>_oiE(l-t_̮ZvfWӌ7vs,Bli%l)cOTńRTQ"kb*A!@F 1-l hW0yx2~^E<6m5b5|#SvmMw%FzŃ/\20pT;C~u;nr?w`'YPtc0EXﴟJNTOZ/|%YE,J^i,jzX*Tx8\ d\Դɜ8p:#`˦o,!mo٪E;[wKyt\58LD J gHz3`#`G2P 6ykP@e/[W86ob^AHoLJ: <30 s=jk 8>^WRXg>jRKRI$kE$_~3;2Mp~8pΛL/g"ͦnҭb'N>~z~ˁu]-ӌDsvblbhamuE,9YcUhx_ޔ7t`[XGUzxW/r9] G7^?{2b)~Lo]=MgPSr c?\sp7෩b?^8[`B2J]FgpYsmTQ^y՛s?;xa_631 0Tb̸C(ɕ?@Qm]Yp)瘟FH}4lj+7<~e^9;*NbbX<1k6-)_)-pX9[jߎ OI%㚁<~L^ @ BGJo%# "#Fdkjc=rW݂:g6J퇻# &`7L! 
wffn+o7Z`:C<4x=E]ud{o_Yٿzh_\K3lޘd`ikxNޚ{p桅9sϥB* NoP96~tI $$[NE$+W?(˯/Sw@Deï9[ /o?ÅhުqbzW?{=+ABd :/2 xNAbR ?N|z*F}"8;~%upqc옐8-%p*iiE5 ;vFPڗˠ0-!Ȉ;_U-Ǐc\"I,EAдXk"H}PM49a rˁ8ߘ ; iFb_g~HQ\KIּc~ʶh-=+o[V/hwzHcNDJq_Xa&5`4ኈE'-e}|Ӥ $ݒ Nsby`Jw uZF=!ɀ:٩H) @`OՖOvLRCܤ {uE/KyJ՗M1j#?R*LIY1~ =/ 2祟% ]JE41,9|OuhJd|@(̂ƯOYlBÔ8ò?s#h:N v`Sa+s[R1R!D ~2ڜZqT%.^j-a RNBS#tN$9 o8Қgxx F KBTرQKf|Qk%뢏Ge:W#e٘p>TV3G\Y"`R#T/BW6F0$6wӕH.oeKR5{WL_UPhR :a̿$oߟqO?L*6x=rWn z{ "2D0 OX&P_;'3܀ϺoMSLw>SX3{<1THý9~0C g`7#d0?rk 4D]su#k4vщޙ:M=&Ia1t=x`sb~x}ӁmFю<٣}[4PtSXyIvF O$׭Z}2̻  B=:<Eh՞ 'fUq+լ3uU~+9IK7܉iW=@yy]\?sU^MFG-$*P#Y+s;bݛ%A:,ܚ@֝Wa8_"Js]x !+]Zb q[ K8=f['+T ޏ(ЫC׻ɑͩBHleп\+x6ZSD$:xDDK7QyWE1ɳެs,yp8;-:w8Opc COg`WozYxs/n\}.xr0V4ĵ݈*/~I4rwcӟ b ^9>P3MQA43W$ .?ԕoɺa!XT bT'mix維[z]uBC>sM)\n\CnĨN;X}"n5֭ USLUa=EԖKh$;H3Ρ< ^_`pZTA'{YGsIv?dκ'-Jk%x{ݸ X?|k$KؼTnW$N8Fl qJ-3 Sj =++K`\!`A" 'X.1 rS 6$Uҍ2v󞟇؏壆Os>}2% HdTHlA|*pvFcob˞wݟuka9ZMԶsҢEHÐ厐 lN<7VrV;BIauLHFs\7ҴZHE*Z+fEb%7+ 䊖 4fJ _ \- \c,>HEaY+mK+ͭvЯwZhr~:,q$xӏOo2gԽ1F'[yAAAKi 'a!Mp:" U/8`%bw56:Z?= ƹ@kñ֚-$J%Juhݩej^lc81o> Z]_} \LNez_gʇ3h 2pNZ,P 0b|oaO^Rc7i!IN m;Iޟye}܅|*SLRdg[*16mipmn]hg (3R^OLʋ>Z#s8a.#)/TcB)ra[ET}T\t~pW̹N2x|4O#~ӈ|>3j$(epfwC (wT)%Ik[ҽ jˆ;s&ҳo8Bw}?#XܭdV8f;="&V|++΃Of 5};iJt^ tbwNZMпˋ¥w{׳A0RRu΋0?3f9q](r ሂSٚw<=y &u+InQ ?h3)c=f#p,#o ͐ Z~NN^fbʚC ԫ:NKP|.ïYWd\W^W^W^Wg#`XˑS.-F0ĉKK,^HW4 yW/YV'xzb$EYW/Fnmv>H~JHFXcbz}N!8i)jQh4}Ne %9f+J˰H#^48/pVBaGb϶w*5=SlW#̹xVbu{p|sۭobt,:42M$տD>lRMq~㵔2 e&Mq+pb ɗu_fXJ][gEN'KsMaÆH}:48f<\q#2c0`F`Nae*:yNoz&0<9CGC5Z>p)Қj-N[KYY5Ԇ>9S|%s̸BLŭd >&Nk#T:/$FTJrJBo cKE/qN!؀ '8̫RXEΊ"/O1oRh3GHQr쭁Y91>`"?8ߖ7M)b/beف-M+M3Axs90ڠ`]:`2pjpJ,)͘b.ӤYii>BC^e5w'k C1+ֱpF!"bD/fּE 1bW™;]3z仪x-0G~i7Vᭆn QiUm+Q)'; jndˁ vlldDq%iL9W>J>GIF vY&A2Bţ}څ1d4$)ɫ }Y_[ΪR3ʪ>lRwe%`0 hZ_vחorEJpq.26k͗W} +EH;_> zYӟ@W7|wן݇s`MK):ߝpyu}+ݬb--UZ_}ۿ(*;DJ[wWw+bd& ]I}o>~*,HS8fx_[ɼ\|ӷ}Ko띔__=?RWz wKX#[VoQ =6o4{:#C)Y,M<*gm kTY IqȆ] XxaJWj^A6LBʽ-.̦2ywu}g+vwy{øn$DPW $|IZjQ.;*τT|L$BMZd'un)5slrWt 3 {[xK;z?} ==\äE - *0ֲT l^\ɘ T`DQK>3_{enq) X~Ee/ kz2s߫-C`+tUieAӣ#ACY@>q,`Sk[1<3zJ΃%ћa5b3Ɠ(}ڨ5(xu>=S{!I- 
bIO9Z*#OZ(t؈O-Z#' ɔ<4*Wǀj. a 6))]Ui9zle;a*Ii0(SnWbj=Tm?B=oRk)ne(#U %!Zm qtpGm^[m"[\\|p4D!寱W-PsېQdƦC0bSF(Bviaٟb&m[A,E[*J%IEmQR<@FC17hr!x.a%ʥ u7!H05n8C, &p3l*=l?6+Ei3\[_* 7ˡthd^)6$ 3ӎ {wP=Äʇ #^а01h F% JcX ΰrXԢXDR:cB) -QvC볹(hf][T4NGa0vՁz0kXk!r_~`J`̬+Plݵ^?r4F"SXuIydnbG"ˍvUV ! I`ubҚ6H~yV`l5x+l5 ly00 ^荬r+|7Ux:,SwsI}Ia,P*s¹sY!,)TN.D Of  M.@u8 !lH8yCzjcQubCQ9dZ0te y*:E?}z˺1t ҏhݦFu1ĺ2\yh6dҺ͡!\ET8TmIMALcu;ޡ'˔h96šֺ͠!\E{::+ jWu*]vQ%Jdyn%P}Lwsi.3kDK`urC0Ru!Ͼ+L@ɽr/O7ܞ+֭IDc!~{rMTaҽ@ ߻x~n+.?}e#i}J->Їr bM5}C1Ƥ@&Ɉ3]CT &EhBQ/2'#gE6dC%F}fS`vX) |__^yzV{]*WlG}xuSO~^(yQ󢫟oh/eRx<婠L-G [6h#@*g?+|*u\~ԁ^ƨQ`/YO뷱{&YrEfTw56`*QDY`q?ΟOF6IHJeӦMAph:ء,J>NU_9}=J&JIbq{eT"Xܲ:_kk6O3dmIim>zdC,'$yF'.!'2v=!L*2$x},_.U%bhi>TWNE^JsŸ<@/4o/2m= R4m# 6XFdː45(dpҩy0 6YCaO}VF6 QX6dG T%(F9OɱQgs=Vzyp.>TI5Ѳ~MDV^x+5R_IՆ̜ށM^S;vFvA5w=$73u򏟁VjLߜN67$YA 3:bW&ЇL3Fg6=jo9`^_?gwh,6Aeɵ d O=c@׉vԼMDYHx9!uItsfIv xc%h4 %_LDP1Ps@EdXtaJL^,S$E\9^1Zq[+%)Zű8~)ɝZ^(jyQ򢫖R+|v )mMV{c( !z""[hQdY]J2uI$Sͪ^ωUS15vd* ]% KS`a/"1 /XKŸ^(ES$.DK–6iL4s5I,[lTu'ۂ* yD&$רgCL`" ;'C\R7Ī7teztkN@F- WO1}Z!uThT6AH*N І@ .Ϧr!Axc7%U>mRf;Won46\I&fa06DPVO>Y#um,"Jdc"(DX{G}Q+QC%JV &Vb .ֻze括\5JenT䐩YJȼ쌗0+ F+HqƻIACJW(Ma9ñ5v'P-.U 9g{;u˷pU\5t/S|sx!؛ænf9jv⺻YfIT$g]t_<˙!tkؐ5o؉ 9hDR[< Ӂp9ӣZURV , ^_0|x nW|K(ρ !pty\mb2Q>{6 c4(ym mT mhl\ \Yi͙ye[`xQgݟ*yq (pB`/xX`p/k}OǐɀXF/ېz?QW#Ӫ RU1EIJ!{ >!bLBR:hw값b-km#I/h/ A"vOYIJ7|DQ>.v-3=WWW3c XZ, -ͥuك4IfA"Y ." 
BB4c:g&s85e9x9çj6BQ 1Bm dl+.%״9GcP%zTJ0hJ.5Ue_sъ@C/k.zWBOeڳw(,Ҝbwb\:W;c:7ޅ\O^"A䧫YUE]^[W=|6L\떛-K =\_V\>F_5o\WIb;qr%%At5۠0L sHJyrf=#oa5g-%F"/vߚ_'9}\ oEu%XsqP8~wWRrӿ3iq @ǩ@J͌@ p|[Cxf==36y.Ǟo| RJc{?b@ZKv>?%G%H[~F"?%m.􀸹4p/zs|0~^e;^{3D=ZBO5K|KOz_ Le {JpW ZW9F!.r4iR[ٍ't#eq˻!z@_hk,?UIs{'.ۺ//n\2⷏]7R]$iߖ;NŚ`$p4dH!L!*'sd;-'ȉ5gO-^Trgq~K) blN2-JInp1+t1D 0^o!METUE*BQ T7@/#2NdeNBL&J_(I0}54W᧔@TT>-lTif5p5\VէkhkP)×RKQhF=!QhBM%n.JܪiVUV˄>E[;:^J[.1+q;\_V>FO%n-8{QŖ^yA8>sRˬU m+0n}UW3¾¼ʆP[.PZJ7y?cN~|^i!xWG=̼5h;f=x섭Gt3^,HF0,1"ŷ?QN]O )F͝[̿|VVJTJ@/dztjzZ$Z?--=nBBV.khxQ螢hw-f?Mh2ewu3]g&,]/=> ӗ+4 ?-Wfa:UBP*COQ:$w.dJ{MQ.wJ6QHMLuZ߹noWx_LHS4/g$,;G8'J!Pߙ_g+)/TˍtYa-Ob[dM1[xc{9w0ÈN+pY6=z’ÌgPZ}@lֱg_#{l{V6o :NJGp a"3ORX{BS)[XKU??R@O}|)!^‘8+#B!H;M\1}Դ:/wܟdka4imQ'2gejpM_ޚ0Lm^|^/ċ*OE˳wI P[KJ h3"(Oہlyl uɣxvY7F:m` @Q#5?bΕz~YW#5 ׌\FPId[v|Xjp1!1d{> ],kt?UUD,V0Yxo4*̖MSE!u:jy 1=U-I`(5ߩ(m2Jm9ҕD@0]h*n-moz'9zg'b٫)uq}0ڳWiqG^t8Ҏ8I#G8{.jv ݬxOKxvI[IgtqyVBij/{dUhbh?l¬2eMs25@H8#Jw5#:ȕRh|6 g#o ݮV%W|QIzPg7ڰ_GGYP1Yۯ)D*PR"ThT31aHm ,%!$ ]QX|5}[,ړx,1ޛPT.01Hk!D)OLV8c!4_ { [:_f_qJZRڒϷG)z0e/'R YPP@ GdZ  Tst^rϒnILwٷR"DYݒ򿣜8bMU7щRKKOVwZCOf QT}C)ő=NށѪͭp㍀@nB D'P=?:t@,7p}FSFLAŘ#1phQ2$Hc>V4!~xG +/ ]qkb? oW.ξL'3I(;yUk6[巑L< ^{NSB z4R f⦨c~$+4 $G*HI+T CCL2A/^ ^xdFPCN3cF?%^zCQ=lt\0oieS0JҞuէ|R[)ѐ#&YA\NrlQ>c ! Leг9.N&$oQU- $:L7`L L!mKrq_3N<#R3P;H#zJe+m5L$~Mf3r7yXQ,5Q*R&VD"ISfbPX%<&f B(a3pd2&xr8xƷ7}SYZ&3Mi"PaP ޢΕ l;;m #{F,bExp)d/Dd cˀ^F' +@4Kz+.H]StiKlqWAy@KZ2^y:cBˀ@$8 1SQ90^DS #^K2Euߜz6VcBo#Z^bG;^2V@9+Ɛc=J|͗{,] Dy)?t f"dpѨKʖ֬)1xfWldf ruu22.aDVZ┧Dd= <!U;wW!l@9Ն_D2^O;:@3R!vE(zRz{fvRXZQ:r:zX9dgj!1EC)4:֠"‚ @5p )hkSYOpj4=;7⑨[cKh `ʑk2 зQqDIaXߥb|VW\K軎usˣPQf7ٻF$Wx$> n,i~ yJ%7H-{ nb̌"+pkˢ L\Wશde{:U7{X0 >_Ww%% F 5n`/.7̞c6! v.h%L0IrpK/fM,r5#@n{x |r4xxݵ^i$ {D .i\5:˼X>QƇU  <_v]u>L xB:(mbѰ;pH$8e? ?M)=XXF {1ywդXOu|tGCTgsb\9#Γ >E;(1 4/Q9CJҎu.AZ <WLj1+R<^plP*i_{0v:3}Vt9nUVOL43V1cq0[ҕOG0B%?yBYWbuϛ %ѶJ<5 =v%^&:8$0d8;m^ QMf.gZ6zkqE? 
|HW_oz βgWxbN)J29ey7;YbvOq5ًсFzᦷc9=w{'0][py9HW%l]B@^kj3lr!9 P{mc)5ky9/Zz}IeP b@_Mmd GFS¸"xi|}ag3E -R{'gqld6DW0I9* 5MQf-r_<@c:*E0Otj(ӎ[E_u*)r[]4sPMS&@6s6 1:|)$#N3'ѫd@KV k>WV* H8qXUCMabW赖TKΧa|TZ&S"QUIC!Bbqq>0WԤ 5u;4y.9| 9dI$iNJd2u)4cSLNh I|v+w۹$0{AG,(CY8l|~ R^;AG'sY^?T\ l{X0x|t*H+w3} /<'EZ&V\ DUW𵱔]l8eq aXL:N6+.fP'u6Zۢ"LfQ h=l3rD YI پ7`t4.V~ `T^gU$3 _SETHV&\,9?ia)嬄* $j'3a#ݷ6iI4,Ψ\ 85x#(u|*ubա2@{R>,K$%|d_>2{r=*cނ cy"iD\!3 Ajɀ!$sՅߣMJ\oV4kDz' Fƃ5$wJF;X"A:|]|n16>\[/t3s%rY>F.Rn&ߗ(9@⯩pYJNc[;}`ŝ검N0u`qD$iUrlp&EM)TN37k ]kӑ{:#P\}x͈O 9A`qYD:8vUCh_00 >=<QN#7=%JArrj- (WL,,?IhՇ5 [({N)=xf SD"rO.mRąQ>t>V`|:#,ⱞp-6FV Fqa7>I[9axQ;"t$vd'\4/=3 :'Fܓ۴: A[7Zxu1BY<8+8xr#yC Wgfu wOG+^u8['HFXoL]Ƈ2C+ '%U"4-§il#p3&1]f[XgoC^Cd~w:/{D_M(jwF/io1LzRcS&[XN7=ߛuo:_+4\-@XG繌jZ#_7ԻaV#kROǽm͊\ zIƏy5p R+^|P3@|?Ηm|_bd[X0P}=ݯʟ kR`M6iF}+|X>ytO[.l8ytm>y#-myZ㟯zq2]e0l6]SSzhGf&J դwgT2 OjKҬG׷8&?7Uy i9G#cmlj$OUa@t13z7'`$?z&?"s~TwT͍ k8`R)Km8T&kI*  /.D++x +K`gsR  ŭ+^&i4&DR9 DLӒ-Zj(eE4g"<=4~̞~WK0O8#PK-(F γO~S]%ӄ)w8(Q3[z"̻$G8ǹA*_-|?rLiJjȻٗE0SwHn1L? 
gIƘ~p]{%6 o3B/ +9t\/6MCUB>h RdpbGIOkڴE ؜6b{RvXwk!4s86Pm*)J۟b:<'?@ΉO1HAޢrl01y$p,\^sCr(PĤb{d4񮜂s]($py4@R8pC@nf(GC 4Dr4DQق~ibu,0cv t2|=!7}nwLjQxoXsǪ-2ƈ壻p2\@_4 ptfZ[P\m:ߍo#e\6N _\8kBª7LcAT󎔪4ߏvzԻVQIqos뀴,:]7vhח"c9_8#jD(F''@$dP2-Ss%:vLdN~,2G iʒX(4DK S|CN&|+XB"GC plT|r1"r^eHVm%A[^נt#{o>7Zwi }Z2]oĿN˵<;iF] >ϥ `EB*r2`辫?_YN4EY69۳@ķL٧ן87] F;”^*V8`lTVoA5=5WVSَXS%#5Uf*U&*J+*jN` X6 UPnzŔE_2J `"LZasԴt4xFq z9]:@$'%m9E#N t #?$EW<y p&vS0u'V˯ݘk?[72z~7~hSt?EB}q6 ɑ05Z6"GS6Nh[8&Y?}oixw{~4)~&eUs>ʛ?2efYdwI-{Ͳkȧ;VUzΨW4^ +"B*IߤR$1MS@-zR6vgRkQ+%9&CLBEC(}IEaG2Z`Jd&9pB.xe2TYQzQ2T d֠[U$R#RBxL-JIQ2BCB <P[:i\˴e\Q%*oxjew "~<).b/8Dؕ8W,V Xqyk[‚GL[Ԋj}9>b_<朮KٜI}KR-J| 3TbNqJ6|~& ytN~wZ3cP3o}@Wj|K3mN8ċyD a]l=)_#, sR qzIҾk|k\zbc{=uIQ_[woNr|Y,{.O jhڕ֏bO&$ʺj7ez2:g'TP9r^>sqOj,u!Yl8-&m/UfMUWBf.jxKIE#!屢̣bv"_k؃(03: xR\JE$359Vpv9{LEi%5#X}Q RV"R̥Ԕ#GȌR1$NN#2x0_grQC!PRwPy;5Zű]vƗmɗ8@PU'1?ݶhrEQ  s 9 mIf:uG:3,țNwn 3-(M"Ĩaa,8D>84W9K6F-ŭ#݌06E+Fܥܞ VoErcXVrҭ?L8Kic cZJuc~ʳ=TďyO}c8AJ'V F9aXŠHmQ DĈx5h"1%u_2 ^/8W55&KeU Q p4,B X2*ɹ֕JQK֎eŒy=RFŇJ i&?Qn&CsS烹XM|~} SҢ|mMbI)Rj%zb-SZǨ FS| `<3ɛ41Fkk.DȬd @1do#i`RrP1Gy$!P02&]?ecjm(S\H5;O]7ddRB*Yꞻ5lUe㧇[A)?cg?>|I Ը#_}u*4K $~Ǥ2OfBZ*"~_4pJIweƓ-O(bTq bw;3/77^IU?{gWa"(W3?eKoT0vQ  qJ]W{Oc[GOO;7^ ?k3O\zs5u4fmh989~>$3=Vv^(ENt* m3S4*:D\gmBdV 5@O7+=ӕi_ Ǔѷ{Y{l=\ 29:9v5koUf&Xgjr$߆Rr#H8|M4cgqGgy Z51w7&gTm[܊CR.@n%3ż.|u*ŧGw[+=I6B{q bt f6͠sحM4ykl%oT߽ŕ:?L<ר;8T&>u%tkF.\%.j.Y\iI;R۵>@km8dmL-Ciാe:~:q>"O|z;QV5ޖiS,? 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,LastTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.121321 4699 manager.go:319] Starting recovery of all containers
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126445 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126525 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126542 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126558 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126568 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126578 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126588 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126602 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126616 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126631 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126643 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126657 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126668 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126681 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126690 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126705 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126722 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126731 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126745 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126759 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126773 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126785 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126799 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126813 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126825 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126841 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126860 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126874 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126888 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126900 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126913 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126925 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.126944 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127007 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127022 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127041 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127066 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs"
seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127082 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127095 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127129 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127145 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127161 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127177 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 26 11:10:56 crc 
kubenswrapper[4699]: I0226 11:10:56.127191 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127204 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127220 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127234 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127251 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127268 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127282 4699 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127295 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127313 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127342 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127359 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127373 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127386 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127409 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127423 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127433 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127444 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127455 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127466 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127479 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127492 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127507 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127519 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127531 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127542 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127552 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127575 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127586 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127599 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127610 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127641 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127661 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127675 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127685 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127698 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127709 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127721 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127766 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127785 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127800 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127813 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127826 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127838 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 26 11:10:56 crc 
kubenswrapper[4699]: I0226 11:10:56.127852 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127864 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127874 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127886 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127899 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127911 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127922 4699 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127932 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127944 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127954 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127967 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127983 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.127996 4699 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128015 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128027 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128038 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128050 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128061 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128080 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128092 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128110 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128144 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128157 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128170 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128184 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" 
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128197 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128212 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128225 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128237 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128254 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128267 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 
11:10:56.128286 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128303 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128394 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128478 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128499 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128558 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128571 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128636 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128650 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128662 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128680 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128708 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128742 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128759 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128771 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128783 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128795 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128807 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128820 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128849 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128858 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128868 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128897 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128907 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128918 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.128929 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129004 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129036 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129139 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129151 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129242 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129306 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129320 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129373 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129383 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129492 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129506 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129520 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129531 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129541 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129551 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129566 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129582 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129616 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129639 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129653 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129669 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129679 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129689 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129698 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129707 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129735 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.129744 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130891 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130925 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130934 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130944 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130954 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.130999 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131024 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131045 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131085 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131635 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131658 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131669 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131767 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.131782 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135015 4699 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135102 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135225 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135257 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135278 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135299 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135318 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135339 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135381 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135438 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135457 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135476 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135491 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135511 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135530 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135569 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135602 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135626 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135664 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135679 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135861 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135887 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135903 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.135953 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.136047 4699 reconstruct.go:97] "Volume reconstruction finished"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.136060 4699 reconciler.go:26] "Reconciler: start to sync state"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.146972 4699 manager.go:324] Recovery completed
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.159850 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.161554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.161614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.161625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.163456 4699 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.163475 4699 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.163504 4699 state_mem.go:36] "Initialized new in-memory state store"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.218820 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.252282 4699 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.254730 4699 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.259040 4699 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.259380 4699 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.259519 4699 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 26 11:10:56 crc kubenswrapper[4699]: W0226 11:10:56.260560 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.260609 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.319144 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.320287 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.359966 4699 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.409982 4699 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897c76e8f2e2097 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,LastTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.420095 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.443422 4699 policy_none.go:49] "None policy: Start"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.445678 4699 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.445762 4699 state_mem.go:35] "Initializing new in-memory state store"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.520920 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.525328 4699 manager.go:334] "Starting Device Plugin manager"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.525412 4699 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.525428 4699 server.go:79] "Starting device plugin registration server"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.525881 4699 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.525909 4699 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.526257 4699 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.526471 4699 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.526584 4699 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.534766 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.560718 4699 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"]
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.560966 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.563104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.563187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.563202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.563433 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.564378 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.564476 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.564867 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.564944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.564988 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.565304 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.565753 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.565812 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.566222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.566288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.566318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567567 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567623 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567696 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567720 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:56 crc
kubenswrapper[4699]: I0226 11:10:56.567910 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.567942 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.568799 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.568845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.568864 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569021 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569097 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569133 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569257 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.569330 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.570064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.570097 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.570127 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.571453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.571484 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.571497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.571711 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.571746 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.574973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.575037 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.575053 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.626187 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.627230 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.627270 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.627287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.627320 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.627954 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642374 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642457 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642481 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642506 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642566 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642679 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642749 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642801 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642839 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642883 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.642947 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.643031 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.643088 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.643140 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.721618 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms" Feb 26 11:10:56 crc 
kubenswrapper[4699]: I0226 11:10:56.744918 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745006 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745032 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745054 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745076 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745153 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745179 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745208 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745234 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745226 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745370 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745546 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746017 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745541 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745265 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745314 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746222 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745517 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 
11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746300 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746323 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745862 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.745389 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746417 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.746531 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.828897 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.830563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.830598 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.830606 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.830630 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:10:56 crc kubenswrapper[4699]: E0226 11:10:56.831092 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection 
refused" node="crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.905173 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.924321 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.940450 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.958525 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 11:10:56 crc kubenswrapper[4699]: I0226 11:10:56.964360 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.037766 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.037861 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.061081 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d35da75391c4c8e7c96b84a226f1de3160a84a3aa73572d927d673ba2153c5cd WatchSource:0}: Error finding container d35da75391c4c8e7c96b84a226f1de3160a84a3aa73572d927d673ba2153c5cd: Status 404 returned error can't find the container with id d35da75391c4c8e7c96b84a226f1de3160a84a3aa73572d927d673ba2153c5cd Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.063349 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8746dd7735b13fa5391cba71510e4192195b692f4db52e504c78bfc50e515d54 WatchSource:0}: Error finding container 8746dd7735b13fa5391cba71510e4192195b692f4db52e504c78bfc50e515d54: Status 404 returned error can't find the container with id 8746dd7735b13fa5391cba71510e4192195b692f4db52e504c78bfc50e515d54 Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.065521 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f4b46e19a235baeabadab9914ce0093d472ebf5699e8e17e1f6a3ca1047aa08a WatchSource:0}: Error finding container f4b46e19a235baeabadab9914ce0093d472ebf5699e8e17e1f6a3ca1047aa08a: Status 404 returned error can't find the container with id f4b46e19a235baeabadab9914ce0093d472ebf5699e8e17e1f6a3ca1047aa08a Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.066764 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ab153560b725156be9926758f43e0c12ac1085335ac564a5eae51d4751966c97 WatchSource:0}: Error finding container ab153560b725156be9926758f43e0c12ac1085335ac564a5eae51d4751966c97: Status 404 returned error can't find the container with id 
ab153560b725156be9926758f43e0c12ac1085335ac564a5eae51d4751966c97
Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.070377 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-69ed669d0b1cfcbcf0f180f5869f0ffbeb79283372bb6f72c4a3483f1dcea9f1 WatchSource:0}: Error finding container 69ed669d0b1cfcbcf0f180f5869f0ffbeb79283372bb6f72c4a3483f1dcea9f1: Status 404 returned error can't find the container with id 69ed669d0b1cfcbcf0f180f5869f0ffbeb79283372bb6f72c4a3483f1dcea9f1
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.118896 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.146259 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.146393 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.232045 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.233979 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.234048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.234061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.234095 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.234807 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.264695 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d35da75391c4c8e7c96b84a226f1de3160a84a3aa73572d927d673ba2153c5cd"}
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.265632 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"69ed669d0b1cfcbcf0f180f5869f0ffbeb79283372bb6f72c4a3483f1dcea9f1"}
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.266528 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ab153560b725156be9926758f43e0c12ac1085335ac564a5eae51d4751966c97"}
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.267554 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f4b46e19a235baeabadab9914ce0093d472ebf5699e8e17e1f6a3ca1047aa08a"}
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.268464 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8746dd7735b13fa5391cba71510e4192195b692f4db52e504c78bfc50e515d54"}
Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.362357 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.362442 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:57 crc kubenswrapper[4699]: W0226 11:10:57.447970 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.448168 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.522678 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s"
Feb 26 11:10:57 crc kubenswrapper[4699]: I0226 11:10:57.943895 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 11:10:57 crc kubenswrapper[4699]: E0226 11:10:57.945407 4699 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.035055 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.037032 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.037099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.037140 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.037176 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:10:58 crc kubenswrapper[4699]: E0226 11:10:58.037769 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 26 11:10:58 crc kubenswrapper[4699]: I0226 11:10:58.119952 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:58 crc kubenswrapper[4699]: W0226 11:10:58.859107 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:58 crc kubenswrapper[4699]: E0226 11:10:58.859798 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.119038 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:59 crc kubenswrapper[4699]: E0226 11:10:59.123512 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s"
Feb 26 11:10:59 crc kubenswrapper[4699]: W0226 11:10:59.404936 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:10:59 crc kubenswrapper[4699]: E0226 11:10:59.405038 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.638567 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.640214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.640281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.640300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:10:59 crc kubenswrapper[4699]: I0226 11:10:59.640340 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:10:59 crc kubenswrapper[4699]: E0226 11:10:59.640991 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 26 11:11:00 crc kubenswrapper[4699]: W0226 11:11:00.035026 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:00 crc kubenswrapper[4699]: E0226 11:11:00.035092 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:11:00 crc kubenswrapper[4699]: I0226 11:11:00.118700 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:00 crc kubenswrapper[4699]: W0226 11:11:00.399234 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:00 crc kubenswrapper[4699]: E0226 11:11:00.399307 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.117925 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.280506 4699 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836" exitCode=0
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.280685 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.280841 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.282837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.282885 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.282897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.283033 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035" exitCode=0
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.283112 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.283169 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.284185 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.284225 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.284237 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.285227 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.285275 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.286382 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.286826 4699 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b3a519dfbaf61432d8a2ac84be99b349ba10be387e76e3482dd82f11dacd1e2a" exitCode=0
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.286885 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b3a519dfbaf61432d8a2ac84be99b349ba10be387e76e3482dd82f11dacd1e2a"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.286922 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.287211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.287248 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.287263 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288067 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288108 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288138 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288581 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07" exitCode=0
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288626 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07"}
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.288747 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.290423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.290451 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:01 crc kubenswrapper[4699]: I0226 11:11:01.290488 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.119721 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.211824 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 11:11:02 crc kubenswrapper[4699]: E0226 11:11:02.213173 4699 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.296306 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.296359 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.296240 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00" exitCode=0
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.297905 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.297940 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.297952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.303890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.303961 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.303976 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.304271 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.306317 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.306351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.306362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.310704 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.310779 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.310794 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.314081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b1cda06107373ef4a7be9d68d9a39ed9f7351913e1deb1bd9e7d825d93ee54a7"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.314137 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.314163 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0bc27153e659e049d639cf7b8963c1485433aed35f5efe5e88f1cc275d92a39"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.315515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.315563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.315580 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.316231 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"aadd2bede3bd40a4bdf48952422350955b12efacb3598661223bc1d386191df4"}
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.316340 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.317329 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.317374 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.317385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: E0226 11:11:02.324547 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="6.4s"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.835735 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.841841 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.843959 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.844048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.844068 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:02 crc kubenswrapper[4699]: I0226 11:11:02.844110 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:11:02 crc kubenswrapper[4699]: E0226 11:11:02.844856 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.119074 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:03 crc kubenswrapper[4699]: W0226 11:11:03.292209 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 26 11:11:03 crc kubenswrapper[4699]: E0226 11:11:03.292346 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.319361 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.323507 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec"}
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.323651 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4fe347fb042f4777ab48cf760a13b19a6e283d98b2b10d80ea49490675e5357"}
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.323789 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.324994 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.325045 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.325062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.326677 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328092 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4" exitCode=0
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328147 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4"}
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328257 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328299 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328327 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.328268 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329867 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329916 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.329879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.330320 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.330345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.330356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:03 crc kubenswrapper[4699]: I0226 11:11:03.865778 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335711 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99"}
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335772 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536"}
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335784 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6"}
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335788 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335875 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335896 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.335961 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.336573 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337043 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337085 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337210 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337270 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:04 crc kubenswrapper[4699]: I0226 11:11:04.337868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.154298 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.170942 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.344094 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8"}
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.344176 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22"}
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.344212 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.344280 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.344280 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345700 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345750 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.345879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.346400 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.346443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:05 crc kubenswrapper[4699]: I0226 11:11:05.346457 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.346837 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.346961 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349235 4699 kubelet_node_status.go:724] "Recording event message for node"
node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349289 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349308 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:06 crc kubenswrapper[4699]: I0226 11:11:06.349253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:06 crc kubenswrapper[4699]: E0226 11:11:06.534929 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:11:08 crc kubenswrapper[4699]: I0226 11:11:08.274292 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 26 11:11:08 crc kubenswrapper[4699]: I0226 11:11:08.274542 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:08 crc kubenswrapper[4699]: I0226 11:11:08.276256 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:08 crc kubenswrapper[4699]: I0226 11:11:08.276327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:08 crc kubenswrapper[4699]: I0226 11:11:08.276342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.245456 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 
11:11:09.247104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.247185 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.247197 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.247227 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.552090 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.552395 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.579752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.579810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:09 crc kubenswrapper[4699]: I0226 11:11:09.579829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.376581 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.377220 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.378772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 
11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.378860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.378880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.383079 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.457011 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:10 crc kubenswrapper[4699]: I0226 11:11:10.699166 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 11:11:11 crc kubenswrapper[4699]: I0226 11:11:11.368837 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:11 crc kubenswrapper[4699]: I0226 11:11:11.370362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:11 crc kubenswrapper[4699]: I0226 11:11:11.370433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:11 crc kubenswrapper[4699]: I0226 11:11:11.370443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:12 crc kubenswrapper[4699]: I0226 11:11:12.371383 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:12 crc kubenswrapper[4699]: I0226 11:11:12.372536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:12 crc kubenswrapper[4699]: I0226 11:11:12.372591 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:12 crc kubenswrapper[4699]: I0226 11:11:12.372605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.461108 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.461919 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.966816 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.968622 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.972239 4699 event.go:368] "Unable to write event (may retry after sleeping)" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897c76e8f2e2097 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,LastTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.972684 4699 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 11:11:13 crc kubenswrapper[4699]: W0226 11:11:13.972764 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.972874 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.973938 4699 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.974004 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 11:11:13 crc kubenswrapper[4699]: W0226 11:11:13.976350 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.976434 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 11:11:13 crc 
kubenswrapper[4699]: I0226 11:11:13.979178 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.979346 4699 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 11:11:13 crc kubenswrapper[4699]: I0226 11:11:13.979437 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 11:11:13 crc kubenswrapper[4699]: W0226 11:11:13.979817 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z Feb 26 11:11:13 crc kubenswrapper[4699]: E0226 11:11:13.979891 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:11:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.121831 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:14Z is after 2026-02-23T05:33:13Z Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.379451 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.382444 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4fe347fb042f4777ab48cf760a13b19a6e283d98b2b10d80ea49490675e5357" exitCode=255 Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.382530 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c4fe347fb042f4777ab48cf760a13b19a6e283d98b2b10d80ea49490675e5357"} Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.382743 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.383683 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.383735 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:14 crc kubenswrapper[4699]: I0226 11:11:14.383748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:14 crc 
kubenswrapper[4699]: I0226 11:11:14.384282 4699 scope.go:117] "RemoveContainer" containerID="c4fe347fb042f4777ab48cf760a13b19a6e283d98b2b10d80ea49490675e5357" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.349540 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:15Z is after 2026-02-23T05:33:13Z Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.388635 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.390966 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90"} Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.391191 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.393421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.393484 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.393497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.402330 4699 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]log ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]etcd ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/generic-apiserver-start-informers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/priority-and-fairness-filter ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-apiextensions-informers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-apiextensions-controllers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/crd-informer-synced ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-system-namespaces-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: 
[+]poststarthook/start-service-ip-repair-controllers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/bootstrap-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/start-kube-aggregator-informers ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-registration-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-discovery-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]autoregister-completion ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-openapi-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 26 11:11:15 crc kubenswrapper[4699]: livez check failed Feb 26 11:11:15 crc kubenswrapper[4699]: I0226 11:11:15.402426 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:11:15 crc kubenswrapper[4699]: W0226 11:11:15.988026 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:15Z is after 2026-02-23T05:33:13Z Feb 26 11:11:15 crc kubenswrapper[4699]: E0226 11:11:15.988144 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.121611 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:16Z is after 2026-02-23T05:33:13Z Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.397536 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.398515 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.401877 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" exitCode=255 Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.401951 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90"} Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.402014 4699 scope.go:117] "RemoveContainer" containerID="c4fe347fb042f4777ab48cf760a13b19a6e283d98b2b10d80ea49490675e5357" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.402235 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.403873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.403930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.403944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:16 crc kubenswrapper[4699]: I0226 11:11:16.404762 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:16 crc kubenswrapper[4699]: E0226 11:11:16.405034 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:16 crc kubenswrapper[4699]: E0226 11:11:16.535244 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 
11:11:17.121765 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:17Z is after 2026-02-23T05:33:13Z Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.407561 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.710464 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.713076 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.715624 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.715680 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.715698 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:17 crc kubenswrapper[4699]: I0226 11:11:17.716502 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:17 crc kubenswrapper[4699]: E0226 11:11:17.716725 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.120916 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:18Z is after 2026-02-23T05:33:13Z Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.312156 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.312383 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.313963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.314025 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.314039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.325568 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.412769 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.413918 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.413977 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.413990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.972255 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.972459 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.973526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.973558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.973572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:18 crc kubenswrapper[4699]: I0226 11:11:18.974210 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:18 crc kubenswrapper[4699]: E0226 11:11:18.974390 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:19 crc kubenswrapper[4699]: I0226 11:11:19.121529 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:11:19Z is after 2026-02-23T05:33:13Z Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.122645 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.159942 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.160111 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.161294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.161387 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.161403 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.162018 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:20 crc kubenswrapper[4699]: E0226 11:11:20.162257 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.164054 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.423355 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.424952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.425235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.425268 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.426375 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:20 crc kubenswrapper[4699]: E0226 11:11:20.426606 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.968811 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.969893 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 
11:11:20.969942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.969955 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:20 crc kubenswrapper[4699]: I0226 11:11:20.969981 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:11:20 crc kubenswrapper[4699]: E0226 11:11:20.971174 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 11:11:20 crc kubenswrapper[4699]: E0226 11:11:20.971181 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 11:11:21 crc kubenswrapper[4699]: I0226 11:11:21.125261 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:22 crc kubenswrapper[4699]: I0226 11:11:22.122460 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:22 crc kubenswrapper[4699]: W0226 11:11:22.587726 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 26 11:11:22 crc kubenswrapper[4699]: E0226 11:11:22.587780 4699 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:23 crc kubenswrapper[4699]: I0226 11:11:23.124010 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:23 crc kubenswrapper[4699]: I0226 11:11:23.457164 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 11:11:23 crc kubenswrapper[4699]: I0226 11:11:23.457298 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 11:11:23 crc kubenswrapper[4699]: W0226 11:11:23.683784 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:23 crc kubenswrapper[4699]: E0226 11:11:23.684374 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: 
User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:23 crc kubenswrapper[4699]: E0226 11:11:23.978492 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e8f2e2097 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,LastTimestamp:2026-02-26 11:10:56.056074391 +0000 UTC m=+1.866900865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:23 crc kubenswrapper[4699]: E0226 11:11:23.984323 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:23 crc kubenswrapper[4699]: E0226 11:11:23.989924 4699 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:23 crc kubenswrapper[4699]: E0226 11:11:23.996349 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.002189 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76eab5cff3d default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.528908093 +0000 UTC m=+2.339734547,LastTimestamp:2026-02-26 11:10:56.528908093 +0000 UTC m=+2.339734547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.007237 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.563165407 +0000 UTC m=+2.373991861,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.011936 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.563196988 +0000 UTC m=+2.374023422,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.016705 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.563211318 +0000 UTC m=+2.374037762,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.021241 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.564917108 +0000 UTC m=+2.375743612,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.026034 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.564978779 +0000 UTC m=+2.375805243,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.031057 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC 
m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.56500369 +0000 UTC m=+2.375830164,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.035738 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.566262887 +0000 UTC m=+2.377089351,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.039957 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.566305848 +0000 UTC m=+2.377132322,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.044962 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.566332149 +0000 UTC m=+2.377158623,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.049645 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.567544114 +0000 UTC m=+2.378370558,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.054347 4699 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.567562034 +0000 UTC m=+2.378388478,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.060163 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.567576185 +0000 UTC m=+2.378402629,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.065506 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.567650277 +0000 UTC m=+2.378476751,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.067243 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.567684278 +0000 UTC m=+2.378510752,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.071737 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.567708849 +0000 UTC m=+2.378535323,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.076393 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.568827501 +0000 UTC m=+2.379653965,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.080416 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.568858072 +0000 UTC m=+2.379684546,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.084842 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578c9a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578c9a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161630633 +0000 UTC m=+1.972457067,LastTimestamp:2026-02-26 11:10:56.568873572 +0000 UTC m=+2.379700046,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.089169 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578464a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578464a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161597002 +0000 UTC 
m=+1.972423436,LastTimestamp:2026-02-26 11:10:56.569291744 +0000 UTC m=+2.380118178,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.094385 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897c76e9578a4b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897c76e9578a4b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:56.161621172 +0000 UTC m=+1.972447606,LastTimestamp:2026-02-26 11:10:56.569327155 +0000 UTC m=+2.380153589,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.100189 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76ecb68ca75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:57.066551925 +0000 UTC m=+2.877378359,LastTimestamp:2026-02-26 11:10:57.066551925 +0000 UTC m=+2.877378359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.104949 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76ecb70313d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:57.067036989 +0000 UTC m=+2.877863423,LastTimestamp:2026-02-26 11:10:57.067036989 +0000 UTC m=+2.877863423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.112728 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c76ecb862b88 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:57.06847732 +0000 UTC m=+2.879303754,LastTimestamp:2026-02-26 11:10:57.06847732 +0000 UTC m=+2.879303754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.117305 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76ecb8ae145 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:57.068785989 +0000 UTC m=+2.879612423,LastTimestamp:2026-02-26 11:10:57.068785989 +0000 UTC m=+2.879612423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: I0226 11:11:24.121248 4699 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.121236 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76ecbf09a6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:10:57.075452523 +0000 UTC m=+2.886278947,LastTimestamp:2026-02-26 11:10:57.075452523 +0000 UTC m=+2.886278947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.126261 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fa8017535 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.767544629 +0000 UTC m=+6.578371073,LastTimestamp:2026-02-26 11:11:00.767544629 +0000 UTC m=+6.578371073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.131321 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fa83340c4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.770808004 +0000 UTC m=+6.581634438,LastTimestamp:2026-02-26 11:11:00.770808004 +0000 UTC m=+6.581634438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.136298 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fa8667987 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.774164871 +0000 UTC m=+6.584991305,LastTimestamp:2026-02-26 11:11:00.774164871 +0000 UTC m=+6.584991305,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.140475 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76fa871e765 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.774913893 +0000 UTC m=+6.585740327,LastTimestamp:2026-02-26 11:11:00.774913893 +0000 UTC m=+6.585740327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.144856 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c76fa8721415 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.774925333 +0000 UTC m=+6.585751767,LastTimestamp:2026-02-26 11:11:00.774925333 +0000 UTC m=+6.585751767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.148816 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fa8a9560a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.778546698 +0000 UTC m=+6.589373132,LastTimestamp:2026-02-26 11:11:00.778546698 +0000 UTC m=+6.589373132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.153786 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fa8ce3600 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.780963328 +0000 UTC m=+6.591789762,LastTimestamp:2026-02-26 11:11:00.780963328 +0000 UTC m=+6.591789762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.158567 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fa90c7ddd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.785044957 +0000 UTC m=+6.595871391,LastTimestamp:2026-02-26 11:11:00.785044957 +0000 UTC m=+6.595871391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.163185 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76fa959761b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.790089243 +0000 UTC m=+6.600915677,LastTimestamp:2026-02-26 11:11:00.790089243 +0000 UTC m=+6.600915677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.168108 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c76fa9a7edea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.795231722 +0000 UTC m=+6.606058156,LastTimestamp:2026-02-26 11:11:00.795231722 +0000 UTC m=+6.606058156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.172504 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fa9a9c902 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.795353346 +0000 UTC m=+6.606179780,LastTimestamp:2026-02-26 11:11:00.795353346 +0000 UTC m=+6.606179780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.177607 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fbb1a2118 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.0879286 +0000 UTC m=+6.898755034,LastTimestamp:2026-02-26 11:11:01.0879286 
+0000 UTC m=+6.898755034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.181906 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fbc9e2cd0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.113359568 +0000 UTC m=+6.924186002,LastTimestamp:2026-02-26 11:11:01.113359568 +0000 UTC m=+6.924186002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.186415 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fbcb413c5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.114794949 +0000 UTC m=+6.925621383,LastTimestamp:2026-02-26 11:11:01.114794949 +0000 UTC m=+6.925621383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.191058 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fc6dc9419 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.285221401 +0000 UTC m=+7.096047835,LastTimestamp:2026-02-26 11:11:01.285221401 +0000 UTC m=+7.096047835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.194387 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fc6e7b571 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.285950833 +0000 UTC m=+7.096777267,LastTimestamp:2026-02-26 11:11:01.285950833 +0000 UTC m=+7.096777267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.198094 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fc7060b18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.28793884 +0000 UTC m=+7.098765274,LastTimestamp:2026-02-26 11:11:01.28793884 +0000 UTC m=+7.098765274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc 
kubenswrapper[4699]: E0226 11:11:24.202878 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76fc71a5a6d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.289269869 +0000 UTC m=+7.100096303,LastTimestamp:2026-02-26 11:11:01.289269869 +0000 UTC m=+7.100096303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.206998 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c76fc73dde5e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.291597406 +0000 UTC m=+7.102423860,LastTimestamp:2026-02-26 11:11:01.291597406 +0000 UTC m=+7.102423860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.211568 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fc83b53ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.308208108 +0000 UTC m=+7.119034542,LastTimestamp:2026-02-26 11:11:01.308208108 +0000 UTC m=+7.119034542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.215836 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fc871584b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.311748171 +0000 UTC m=+7.122574605,LastTimestamp:2026-02-26 11:11:01.311748171 +0000 UTC m=+7.122574605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.219945 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fd4ba234f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.517845327 +0000 UTC m=+7.328671751,LastTimestamp:2026-02-26 11:11:01.517845327 +0000 UTC m=+7.328671751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.223737 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fd4f89c78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.521939576 +0000 UTC m=+7.332766010,LastTimestamp:2026-02-26 11:11:01.521939576 +0000 UTC m=+7.332766010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.227557 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fd50bb6ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.523191502 +0000 UTC m=+7.334017936,LastTimestamp:2026-02-26 11:11:01.523191502 +0000 UTC m=+7.334017936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc 
kubenswrapper[4699]: E0226 11:11:24.231401 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76fd55b3a02 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.528402434 +0000 UTC m=+7.339228868,LastTimestamp:2026-02-26 11:11:01.528402434 +0000 UTC m=+7.339228868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.234956 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fd5869186 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.531242886 +0000 UTC m=+7.342069320,LastTimestamp:2026-02-26 11:11:01.531242886 +0000 UTC m=+7.342069320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.239574 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fd5974654 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.532337748 +0000 UTC m=+7.343164182,LastTimestamp:2026-02-26 11:11:01.532337748 +0000 UTC m=+7.343164182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.243735 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c76fd5b3ff6e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
11:11:01.534220142 +0000 UTC m=+7.345046576,LastTimestamp:2026-02-26 11:11:01.534220142 +0000 UTC m=+7.345046576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.247698 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fd5f53604 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.538493956 +0000 UTC m=+7.349320390,LastTimestamp:2026-02-26 11:11:01.538493956 +0000 UTC m=+7.349320390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.251595 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fd79cfe8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.566267022 +0000 UTC m=+7.377093456,LastTimestamp:2026-02-26 11:11:01.566267022 +0000 UTC m=+7.377093456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.255456 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fd7b20cfc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.567646972 +0000 UTC m=+7.378473406,LastTimestamp:2026-02-26 11:11:01.567646972 +0000 UTC m=+7.378473406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.259581 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897c76fd8eb947b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.588194427 +0000 UTC m=+7.399020861,LastTimestamp:2026-02-26 11:11:01.588194427 +0000 UTC m=+7.399020861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.264018 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fe16e158e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.730964878 +0000 UTC m=+7.541791312,LastTimestamp:2026-02-26 11:11:01.730964878 +0000 UTC m=+7.541791312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.268565 4699 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fe4ee9029 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.789716521 +0000 UTC m=+7.600542975,LastTimestamp:2026-02-26 11:11:01.789716521 +0000 UTC m=+7.600542975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.272865 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76fe50a12d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.791519444 +0000 UTC m=+7.602345878,LastTimestamp:2026-02-26 11:11:01.791519444 +0000 UTC 
m=+7.602345878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.277965 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fe55986ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.796726445 +0000 UTC m=+7.607552879,LastTimestamp:2026-02-26 11:11:01.796726445 +0000 UTC m=+7.607552879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.283309 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fe749e7da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.829257178 
+0000 UTC m=+7.640083612,LastTimestamp:2026-02-26 11:11:01.829257178 +0000 UTC m=+7.640083612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.288709 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76fe76d3a61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.831572065 +0000 UTC m=+7.642398499,LastTimestamp:2026-02-26 11:11:01.831572065 +0000 UTC m=+7.642398499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.293460 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76ff3db399f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.040107423 +0000 UTC m=+7.850933857,LastTimestamp:2026-02-26 11:11:02.040107423 +0000 UTC m=+7.850933857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.297902 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76ff583ccf9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.067932409 +0000 UTC m=+7.878758843,LastTimestamp:2026-02-26 11:11:02.067932409 +0000 UTC m=+7.878758843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.302450 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897c76ff6076bbe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.07655827 +0000 UTC m=+7.887384704,LastTimestamp:2026-02-26 11:11:02.07655827 +0000 UTC m=+7.887384704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.306422 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897c76ff60a39ed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.076742125 +0000 UTC m=+7.887568559,LastTimestamp:2026-02-26 11:11:02.076742125 +0000 UTC m=+7.887568559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.310821 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76ff71b4db2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.094638514 +0000 UTC m=+7.905464948,LastTimestamp:2026-02-26 11:11:02.094638514 +0000 UTC m=+7.905464948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.314831 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c76ff7344e8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.096277131 +0000 UTC m=+7.907103565,LastTimestamp:2026-02-26 11:11:02.096277131 +0000 UTC m=+7.907103565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.318243 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c77002744e49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.285020745 +0000 UTC m=+8.095847199,LastTimestamp:2026-02-26 11:11:02.285020745 +0000 UTC m=+8.095847199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.322575 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c7700387ab58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.303066968 +0000 UTC m=+8.113893402,LastTimestamp:2026-02-26 
11:11:02.303066968 +0000 UTC m=+8.113893402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.328960 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c770038e7869 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.303512681 +0000 UTC m=+8.114339125,LastTimestamp:2026-02-26 11:11:02.303512681 +0000 UTC m=+8.114339125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.333739 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c770039fad76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.304640374 +0000 UTC m=+8.115466808,LastTimestamp:2026-02-26 11:11:02.304640374 +0000 UTC m=+8.115466808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.340752 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c77015d455bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.610081211 +0000 UTC m=+8.420907635,LastTimestamp:2026-02-26 11:11:02.610081211 +0000 UTC m=+8.420907635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.347039 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7701610c28a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.614041226 +0000 UTC m=+8.424867660,LastTimestamp:2026-02-26 11:11:02.614041226 +0000 UTC m=+8.424867660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.351382 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c7701702966e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.629889646 +0000 UTC m=+8.440716080,LastTimestamp:2026-02-26 11:11:02.629889646 +0000 UTC m=+8.440716080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.354926 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c770176dd91e openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.63691907 +0000 UTC m=+8.447745504,LastTimestamp:2026-02-26 11:11:02.63691907 +0000 UTC m=+8.447745504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.360164 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c77040d4b982 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.331527042 +0000 UTC m=+9.142353486,LastTimestamp:2026-02-26 11:11:03.331527042 +0000 UTC m=+9.142353486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.363948 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897c7704cf19855 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.534745685 +0000 UTC m=+9.345572119,LastTimestamp:2026-02-26 11:11:03.534745685 +0000 UTC m=+9.345572119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.367768 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7704db06af8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.547251448 +0000 UTC m=+9.358077882,LastTimestamp:2026-02-26 11:11:03.547251448 +0000 UTC m=+9.358077882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.371889 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7704dc8e20a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.548854794 +0000 UTC m=+9.359681228,LastTimestamp:2026-02-26 11:11:03.548854794 +0000 UTC m=+9.359681228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.375705 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c77060c39f52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.867277138 +0000 UTC m=+9.678103572,LastTimestamp:2026-02-26 11:11:03.867277138 +0000 UTC m=+9.678103572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.377396 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c770618f0e37 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.880609335 +0000 UTC m=+9.691435769,LastTimestamp:2026-02-26 11:11:03.880609335 +0000 UTC m=+9.691435769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.379320 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c77061a7ff7a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:03.882243962 +0000 UTC m=+9.693070396,LastTimestamp:2026-02-26 11:11:03.882243962 +0000 UTC m=+9.693070396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.381910 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897c7706c99abc7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.065854407 +0000 UTC m=+9.876680841,LastTimestamp:2026-02-26 11:11:04.065854407 +0000 UTC m=+9.876680841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.386209 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7706d536858 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.07802684 +0000 UTC m=+9.888853274,LastTimestamp:2026-02-26 11:11:04.07802684 +0000 UTC m=+9.888853274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.391073 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7706d6fc8b1 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.079886513 +0000 UTC m=+9.890712947,LastTimestamp:2026-02-26 11:11:04.079886513 +0000 UTC m=+9.890712947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.396715 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c77088e25879 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.540379257 +0000 UTC m=+10.351205691,LastTimestamp:2026-02-26 11:11:04.540379257 +0000 UTC m=+10.351205691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.401368 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897c77089c0bb44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.55495354 +0000 UTC m=+10.365779964,LastTimestamp:2026-02-26 11:11:04.55495354 +0000 UTC m=+10.365779964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.405071 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c77089d4cf42 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.556269378 +0000 UTC m=+10.367095812,LastTimestamp:2026-02-26 11:11:04.556269378 +0000 UTC m=+10.367095812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.410303 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c770957710f8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.751452408 +0000 UTC m=+10.562278842,LastTimestamp:2026-02-26 11:11:04.751452408 +0000 UTC m=+10.562278842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.414828 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897c7709673fd73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:04.768028019 +0000 UTC m=+10.578854453,LastTimestamp:2026-02-26 11:11:04.768028019 +0000 UTC m=+10.578854453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.422377 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 
11:11:24 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-controller-manager-crc.1897c7729ca57718 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 26 11:11:24 crc kubenswrapper[4699]: body: Feb 26 11:11:24 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:13.461868312 +0000 UTC m=+19.272694766,LastTimestamp:2026-02-26 11:11:13.461868312 +0000 UTC m=+19.272694766,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:24 crc kubenswrapper[4699]: > Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.426692 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c7729ca746a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
11:11:13.461986985 +0000 UTC m=+19.272813429,LastTimestamp:2026-02-26 11:11:13.461986985 +0000 UTC m=+19.272813429,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.430998 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 11:11:24 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-apiserver-crc.1897c772bb2bb73b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 11:11:24 crc kubenswrapper[4699]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 11:11:24 crc kubenswrapper[4699]: Feb 26 11:11:24 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:13.973983035 +0000 UTC m=+19.784809479,LastTimestamp:2026-02-26 11:11:13.973983035 +0000 UTC m=+19.784809479,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:24 crc kubenswrapper[4699]: > Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.435221 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897c772bb2c864e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:13.974036046 +0000 UTC m=+19.784862490,LastTimestamp:2026-02-26 11:11:13.974036046 +0000 UTC m=+19.784862490,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.438850 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897c772bb2bb73b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 11:11:24 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-apiserver-crc.1897c772bb2bb73b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 11:11:24 crc kubenswrapper[4699]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 11:11:24 crc kubenswrapper[4699]: Feb 26 11:11:24 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:13.973983035 +0000 UTC 
m=+19.784809479,LastTimestamp:2026-02-26 11:11:13.979407907 +0000 UTC m=+19.790234341,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:24 crc kubenswrapper[4699]: > Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.442444 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897c772bb2c864e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c772bb2c864e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:13.974036046 +0000 UTC m=+19.784862490,LastTimestamp:2026-02-26 11:11:13.979471019 +0000 UTC m=+19.790297463,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.446803 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897c770039fad76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c770039fad76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.304640374 +0000 UTC m=+8.115466808,LastTimestamp:2026-02-26 11:11:14.385739106 +0000 UTC m=+20.196565540,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.450530 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897c77015d455bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c77015d455bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.610081211 +0000 UTC m=+8.420907635,LastTimestamp:2026-02-26 11:11:14.567235821 +0000 UTC m=+20.378062255,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.454904 4699 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.1897c7701702966e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897c7701702966e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:02.629889646 +0000 UTC m=+8.440716080,LastTimestamp:2026-02-26 11:11:14.576824281 +0000 UTC m=+20.387650715,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.462081 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 11:11:24 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-controller-manager-crc.1897c774f06b1a12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 11:11:24 crc kubenswrapper[4699]: body: Feb 26 11:11:24 crc kubenswrapper[4699]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:23.457264146 +0000 UTC m=+29.268090590,LastTimestamp:2026-02-26 11:11:23.457264146 +0000 UTC m=+29.268090590,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:24 crc kubenswrapper[4699]: > Feb 26 11:11:24 crc kubenswrapper[4699]: E0226 11:11:24.466087 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c774f06c2daa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:23.457334698 +0000 UTC m=+29.268161132,LastTimestamp:2026-02-26 11:11:23.457334698 +0000 UTC m=+29.268161132,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:25 crc kubenswrapper[4699]: W0226 11:11:25.093018 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 26 11:11:25 crc kubenswrapper[4699]: E0226 11:11:25.093110 4699 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:25 crc kubenswrapper[4699]: I0226 11:11:25.128452 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:26 crc kubenswrapper[4699]: I0226 11:11:26.124100 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:26 crc kubenswrapper[4699]: E0226 11:11:26.535337 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.123269 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.971696 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.973427 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.973468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.973477 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:27 crc kubenswrapper[4699]: I0226 11:11:27.973684 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:11:27 crc kubenswrapper[4699]: E0226 11:11:27.977907 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 11:11:27 crc kubenswrapper[4699]: E0226 11:11:27.978222 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 11:11:28 crc kubenswrapper[4699]: I0226 11:11:28.124391 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:29 crc kubenswrapper[4699]: I0226 11:11:29.124014 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:29 crc kubenswrapper[4699]: W0226 11:11:29.968849 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 26 11:11:29 crc kubenswrapper[4699]: E0226 11:11:29.968936 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource 
\"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:30 crc kubenswrapper[4699]: I0226 11:11:30.124751 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:30 crc kubenswrapper[4699]: I0226 11:11:30.658554 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 11:11:30 crc kubenswrapper[4699]: I0226 11:11:30.675106 4699 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 11:11:31 crc kubenswrapper[4699]: I0226 11:11:31.123199 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.122790 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.472618 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:40112->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.472711 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:40112->192.168.126.11:10357: read: connection reset by peer" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.472771 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.472912 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.474627 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.474703 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.474733 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.475951 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 26 11:11:32 crc kubenswrapper[4699]: I0226 11:11:32.476245 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6" gracePeriod=30 Feb 26 11:11:32 crc kubenswrapper[4699]: E0226 11:11:32.477819 4699 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 11:11:32 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-controller-manager-crc.1897c77709c743b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:40112->192.168.126.11:10357: read: connection reset by peer Feb 26 11:11:32 crc kubenswrapper[4699]: body: Feb 26 11:11:32 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:32.472669107 +0000 UTC m=+38.283495541,LastTimestamp:2026-02-26 11:11:32.472669107 +0000 UTC m=+38.283495541,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:32 crc kubenswrapper[4699]: > Feb 26 11:11:32 crc kubenswrapper[4699]: E0226 11:11:32.483794 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c77709c8545d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:40112->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:32.472738909 +0000 UTC m=+38.283565333,LastTimestamp:2026-02-26 11:11:32.472738909 +0000 UTC m=+38.283565333,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:32 crc kubenswrapper[4699]: E0226 11:11:32.492601 4699 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c77709fd7be5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:32.476222437 +0000 UTC m=+38.287048931,LastTimestamp:2026-02-26 11:11:32.476222437 +0000 UTC m=+38.287048931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:33 crc kubenswrapper[4699]: E0226 11:11:33.067816 4699 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.1897c76fa8ce3600\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fa8ce3600 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:00.780963328 +0000 UTC m=+6.591789762,LastTimestamp:2026-02-26 11:11:33.062108385 +0000 UTC m=+38.872934819,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.124223 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:33 crc kubenswrapper[4699]: E0226 11:11:33.405257 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897c76fbb1a2118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fbb1a2118 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.0879286 +0000 UTC m=+6.898755034,LastTimestamp:2026-02-26 11:11:33.403473437 +0000 UTC m=+39.214299871,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:33 crc kubenswrapper[4699]: E0226 11:11:33.424021 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897c76fbc9e2cd0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c76fbc9e2cd0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:01.113359568 +0000 UTC m=+6.924186002,LastTimestamp:2026-02-26 11:11:33.422559584 +0000 UTC m=+39.233386018,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.460987 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.461336 4699 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6" exitCode=255 Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.461393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6"} Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.461475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55229c06747f2b5d388af00f4d2aa770f2786ea7f8015579fb05381eee44235f"} Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.461672 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.462923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.462974 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:33 crc kubenswrapper[4699]: I0226 11:11:33.462983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.123656 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.978506 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.981091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.981169 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.981190 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:34 crc kubenswrapper[4699]: I0226 11:11:34.981234 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:11:34 crc kubenswrapper[4699]: E0226 11:11:34.986439 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 11:11:34 crc kubenswrapper[4699]: E0226 11:11:34.986859 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 11:11:35.123169 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 11:11:35.260388 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 
11:11:35.262344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 11:11:35.262401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 11:11:35.262413 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:35 crc kubenswrapper[4699]: I0226 11:11:35.263108 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.122521 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.469180 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.469600 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.471258 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" exitCode=255 Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.471310 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f"} Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.471351 4699 scope.go:117] "RemoveContainer" containerID="8434a1d9c6c5f5c25ebe428c5621131d21b6a77cab0e7b5990805437c2e11e90" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.471455 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.472318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.472349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.472361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:36 crc kubenswrapper[4699]: I0226 11:11:36.472854 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" Feb 26 11:11:36 crc kubenswrapper[4699]: E0226 11:11:36.473046 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:36 crc kubenswrapper[4699]: E0226 11:11:36.535500 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.123267 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.475938 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.710363 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.710553 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.711687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.711739 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.711752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:37 crc kubenswrapper[4699]: I0226 11:11:37.712347 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" Feb 26 11:11:37 crc kubenswrapper[4699]: E0226 11:11:37.712549 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.123432 4699 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.973026 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.973367 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.975177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.975247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.975268 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:38 crc kubenswrapper[4699]: I0226 11:11:38.975970 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" Feb 26 11:11:38 crc kubenswrapper[4699]: E0226 11:11:38.976195 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:11:39 crc kubenswrapper[4699]: I0226 11:11:39.122627 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.123879 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.377230 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.377441 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.379098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.379167 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.379178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.457204 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.486458 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.488181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 11:11:40.488233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:40 crc kubenswrapper[4699]: I0226 
11:11:40.488244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.123054 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.986950 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.988828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.988879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.988893 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:11:41 crc kubenswrapper[4699]: I0226 11:11:41.988933 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 11:11:41 crc kubenswrapper[4699]: E0226 11:11:41.993099 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 11:11:41 crc kubenswrapper[4699]: E0226 11:11:41.993224 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 11:11:42 crc kubenswrapper[4699]: I0226 11:11:42.123512 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:42 crc kubenswrapper[4699]: W0226 11:11:42.806818 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:42 crc kubenswrapper[4699]: E0226 11:11:42.807361 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:43 crc kubenswrapper[4699]: I0226 11:11:43.123990 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:43 crc kubenswrapper[4699]: I0226 11:11:43.458094 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 11:11:43 crc kubenswrapper[4699]: I0226 11:11:43.458227 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Feb 26 11:11:43 crc kubenswrapper[4699]: E0226 11:11:43.463520 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897c774f06b1a12\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 11:11:43 crc kubenswrapper[4699]: &Event{ObjectMeta:{kube-controller-manager-crc.1897c774f06b1a12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 11:11:43 crc kubenswrapper[4699]: body: Feb 26 11:11:43 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:23.457264146 +0000 UTC m=+29.268090590,LastTimestamp:2026-02-26 11:11:43.45819737 +0000 UTC m=+49.269023804,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:11:43 crc kubenswrapper[4699]: > Feb 26 11:11:43 crc kubenswrapper[4699]: E0226 11:11:43.468629 4699 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897c774f06c2daa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897c774f06c2daa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:11:23.457334698 +0000 UTC m=+29.268161132,LastTimestamp:2026-02-26 11:11:43.458268522 +0000 UTC m=+49.269094956,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 11:11:44 crc kubenswrapper[4699]: I0226 11:11:44.120097 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:45 crc kubenswrapper[4699]: I0226 11:11:45.123667 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:46 crc kubenswrapper[4699]: I0226 11:11:46.123763 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:46 crc kubenswrapper[4699]: W0226 11:11:46.416287 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 26 11:11:46 crc kubenswrapper[4699]: 
E0226 11:11:46.416349 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:46 crc kubenswrapper[4699]: E0226 11:11:46.535681 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:11:47 crc kubenswrapper[4699]: W0226 11:11:47.059032 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 26 11:11:47 crc kubenswrapper[4699]: E0226 11:11:47.059145 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 11:11:47 crc kubenswrapper[4699]: I0226 11:11:47.125703 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.126682 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.993974 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 
11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.995430 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.995498 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.995513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:48 crc kubenswrapper[4699]: I0226 11:11:48.995560 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:11:49 crc kubenswrapper[4699]: E0226 11:11:49.000185 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 11:11:49 crc kubenswrapper[4699]: E0226 11:11:49.000243 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 11:11:49 crc kubenswrapper[4699]: I0226 11:11:49.123128 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.123518 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.460749 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.461048 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.462551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.462600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.462616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.464802 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.513603 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.515455 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.515521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:50 crc kubenswrapper[4699]: I0226 11:11:50.515534 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:51 crc kubenswrapper[4699]: I0226 11:11:51.123230 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.123474 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.841387 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.841525 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.842783 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.842877 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:52 crc kubenswrapper[4699]: I0226 11:11:52.842902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:53 crc kubenswrapper[4699]: I0226 11:11:53.123214 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.125174 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.260725 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.262527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.262666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.262756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:54 crc kubenswrapper[4699]: I0226 11:11:54.263557 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f"
Feb 26 11:11:54 crc kubenswrapper[4699]: E0226 11:11:54.263844 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 11:11:55 crc kubenswrapper[4699]: I0226 11:11:55.124261 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.000641 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.002098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.002170 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.002191 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.002216 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:11:56 crc kubenswrapper[4699]: E0226 11:11:56.006143 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 11:11:56 crc kubenswrapper[4699]: E0226 11:11:56.006408 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 11:11:56 crc kubenswrapper[4699]: I0226 11:11:56.123455 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:56 crc kubenswrapper[4699]: E0226 11:11:56.536216 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 11:11:57 crc kubenswrapper[4699]: I0226 11:11:57.127813 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:58 crc kubenswrapper[4699]: I0226 11:11:58.123836 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:11:59 crc kubenswrapper[4699]: I0226 11:11:59.122810 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:12:00 crc kubenswrapper[4699]: I0226 11:12:00.125680 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:12:01 crc kubenswrapper[4699]: I0226 11:12:01.123961 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:12:02 crc kubenswrapper[4699]: I0226 11:12:02.123279 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 11:12:02 crc kubenswrapper[4699]: I0226 11:12:02.239793 4699 csr.go:261] certificate signing request csr-xfmsk is approved, waiting to be issued
Feb 26 11:12:02 crc kubenswrapper[4699]: I0226 11:12:02.254966 4699 csr.go:257] certificate signing request csr-xfmsk is issued
Feb 26 11:12:02 crc kubenswrapper[4699]: I0226 11:12:02.310741 4699 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 26 11:12:02 crc kubenswrapper[4699]: I0226 11:12:02.516501 4699 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.007173 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.008512 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.008561 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.008581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.008693 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.022636 4699 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.023158 4699 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.023262 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.028497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.028541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.028550 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.028587 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.028599 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:03Z","lastTransitionTime":"2026-02-26T11:12:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.042558 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"r
egistry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"siz
eBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\"
:\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.051347 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.051398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.051409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.051437 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.051454 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:03Z","lastTransitionTime":"2026-02-26T11:12:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.065969 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb
-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.076666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.076710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.076721 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.076743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.076756 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:03Z","lastTransitionTime":"2026-02-26T11:12:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.092335 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb
-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.101655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.101714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.101736 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.102018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.102037 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:03Z","lastTransitionTime":"2026-02-26T11:12:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.114737 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb
-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.114924 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.114949 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.215696 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.259640 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 04:18:17.664345148 +0000 UTC Feb 26 11:12:03 crc kubenswrapper[4699]: I0226 11:12:03.259714 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6641h6m14.404635008s for next certificate rotation Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.316557 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.417325 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.517724 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.618280 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.719039 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.819228 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:03 crc kubenswrapper[4699]: E0226 11:12:03.919351 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.021476 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.122413 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.222826 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: I0226 11:12:04.260534 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:04 crc kubenswrapper[4699]: I0226 11:12:04.262409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:04 crc kubenswrapper[4699]: I0226 11:12:04.262457 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:04 crc kubenswrapper[4699]: I0226 11:12:04.262471 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.323767 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.424643 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc 
kubenswrapper[4699]: E0226 11:12:04.525924 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.626428 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.727485 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.828263 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:04 crc kubenswrapper[4699]: E0226 11:12:04.928803 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.029352 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.129513 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.230381 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.331212 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.431731 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.531929 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.633140 4699 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.733366 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.833982 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:05 crc kubenswrapper[4699]: E0226 11:12:05.934665 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.035697 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.136648 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.237150 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.260613 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.262049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.262092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.262104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.262887 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.338176 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.438819 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.537099 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.539031 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.567066 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.570692 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0"} Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.570972 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.573158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.573209 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:06 crc kubenswrapper[4699]: I0226 11:12:06.573219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.639628 4699 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.740248 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.840845 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:06 crc kubenswrapper[4699]: E0226 11:12:06.941977 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.042147 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.142980 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.243949 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.345582 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.445977 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.546525 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.647198 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: I0226 11:12:07.710528 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:12:07 crc kubenswrapper[4699]: I0226 11:12:07.710714 4699 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:07 crc kubenswrapper[4699]: I0226 11:12:07.712228 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:07 crc kubenswrapper[4699]: I0226 11:12:07.712266 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:07 crc kubenswrapper[4699]: I0226 11:12:07.712280 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.747606 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.848163 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:07 crc kubenswrapper[4699]: E0226 11:12:07.949185 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.049865 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.150816 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.251241 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.351990 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.453072 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc 
kubenswrapper[4699]: E0226 11:12:08.553655 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.580775 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.581410 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.583429 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" exitCode=255 Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.583475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0"} Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.583529 4699 scope.go:117] "RemoveContainer" containerID="c2551d40b2ef08b55319c037bd0cce11c3db01d06a454a47ac46c3f5fea58b7f" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.583951 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.585251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.585302 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.585317 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.586324 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.586561 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.654563 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.755661 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.856160 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: E0226 11:12:08.957062 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:08 crc kubenswrapper[4699]: I0226 11:12:08.972230 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.057800 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.158199 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.258838 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.359363 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.460424 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.561247 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.588394 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.590669 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.591661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.591701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.591713 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:09 crc kubenswrapper[4699]: I0226 11:12:09.592410 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.592600 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.662273 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.763185 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.863489 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:09 crc kubenswrapper[4699]: E0226 11:12:09.964274 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.064894 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.165220 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.266361 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.366554 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.467064 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.568063 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.669397 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.770175 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.870526 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:10 crc kubenswrapper[4699]: I0226 11:12:10.892312 4699 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 11:12:10 crc kubenswrapper[4699]: E0226 11:12:10.971335 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.071967 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.172476 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.272963 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.374046 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.475238 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.575662 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.676647 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc 
kubenswrapper[4699]: E0226 11:12:11.777056 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.877428 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:11 crc kubenswrapper[4699]: E0226 11:12:11.978653 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.079680 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.180683 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.281306 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.381853 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.482774 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.583955 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.684083 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.784840 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.885859 4699 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 11:12:12 crc kubenswrapper[4699]: E0226 11:12:12.987091 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.087888 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.188682 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.240387 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.245537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.245909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.245991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.246090 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.246216 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:13Z","lastTransitionTime":"2026-02-26T11:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.259100 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.265247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.265315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.265329 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.265357 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.265370 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:13Z","lastTransitionTime":"2026-02-26T11:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.278571 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.283316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.283399 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.283441 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.283462 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.283475 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:13Z","lastTransitionTime":"2026-02-26T11:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.295645 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.301984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.302038 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.302051 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.302072 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:13 crc kubenswrapper[4699]: I0226 11:12:13.302086 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:13Z","lastTransitionTime":"2026-02-26T11:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.314309 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.314477 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.314512 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.415677 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.516815 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.617333 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.717935 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.818669 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:13 crc kubenswrapper[4699]: E0226 11:12:13.919151 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.019517 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.119778 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.220848 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.321466 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.421839 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.523024 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.623965 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.725054 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.825864 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:14 crc kubenswrapper[4699]: E0226 11:12:14.926940 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.027417 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.128152 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.229205 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.330039 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc 
kubenswrapper[4699]: E0226 11:12:15.431009 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.531408 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.631753 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.732188 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.833071 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:15 crc kubenswrapper[4699]: E0226 11:12:15.933762 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.034315 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.135190 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.235824 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.336700 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.437173 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.538351 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.538410 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.639072 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.740101 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.841284 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:16 crc kubenswrapper[4699]: E0226 11:12:16.941471 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.042474 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.143204 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.243829 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.344283 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.445361 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.545456 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.645590 4699 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.746680 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.847441 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:17 crc kubenswrapper[4699]: E0226 11:12:17.948546 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.048903 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.149896 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.250708 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.351368 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.452319 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.552526 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.653379 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.753626 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc 
kubenswrapper[4699]: E0226 11:12:18.853956 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:18 crc kubenswrapper[4699]: E0226 11:12:18.954980 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.055414 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.155921 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.256405 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.357206 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.457969 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.559248 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.659441 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.760752 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: E0226 11:12:19.861353 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.873322 4699 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.964607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.964676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.964690 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.964707 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:19 crc kubenswrapper[4699]: I0226 11:12:19.964720 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:19Z","lastTransitionTime":"2026-02-26T11:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.068363 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.068424 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.068434 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.068453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.068463 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.078496 4699 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.130025 4699 apiserver.go:52] "Watching apiserver" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.136739 4699 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.137170 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.137747 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.137816 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.138273 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.138615 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.138354 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.138416 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.139595 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.138309 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.138684 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.142584 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.142585 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.142777 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.143276 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.143614 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.143777 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.143844 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.144021 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.144608 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.171360 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.173304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.173351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.173362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.173378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.173390 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.186958 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.210476 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.219979 4699 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.385497 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.385546 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386531 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.385962 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386564 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386608 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386634 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386660 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386682 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386705 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386726 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386746 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386767 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 
11:12:20.386789 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386880 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386920 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386942 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386969 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.386998 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387023 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387048 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387070 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387092 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387103 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387134 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387158 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387182 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387204 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387359 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387387 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387412 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387436 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387460 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387480 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387500 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 
11:12:20.387524 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387546 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387568 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387595 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387620 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387642 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387665 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387687 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387708 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387729 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387756 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.387777 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387799 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387821 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387871 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387891 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387913 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387931 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387977 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388000 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388020 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388043 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388064 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388083 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388104 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388147 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388172 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388194 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388239 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388263 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388285 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388306 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388326 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388352 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388379 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388403 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388427 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388450 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388471 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388498 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388520 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388541 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388562 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388585 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388609 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388633 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388656 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388679 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388702 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388726 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388745 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388769 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388792 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388824 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388854 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388881 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387221 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387265 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387513 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387552 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387695 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387774 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.387832 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388189 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388230 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388288 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388301 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388421 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388738 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388820 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388838 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388879 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.388904 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389353 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389396 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389426 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389453 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389486 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389549 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389566 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389577 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391056 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391044 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.389514 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391196 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391207 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391263 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391364 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391402 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391437 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391466 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391494 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391502 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391523 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391530 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391556 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391551 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391589 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391621 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391655 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391692 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391722 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391751 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391781 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391828 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391881 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391910 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391936 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391968 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391997 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392024 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.392052 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392080 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392130 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392162 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392192 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392215 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392240 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392268 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392295 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392324 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392357 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.392388 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392419 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392442 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392466 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392495 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392522 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392547 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392576 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392606 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392640 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392668 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392695 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392748 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392770 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392791 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392815 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392836 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392861 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392887 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392913 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392937 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392961 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392985 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393021 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393053 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393077 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393105 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393181 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 
11:12:20.393218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393246 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393274 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393302 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393333 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393358 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393457 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393495 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393525 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393552 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393578 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 11:12:20 
crc kubenswrapper[4699]: I0226 11:12:20.393605 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393635 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393666 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393698 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393727 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393756 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393789 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393817 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393901 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393927 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393962 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393989 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394018 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394048 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394081 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394110 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394159 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394188 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394225 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394272 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394423 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" 
(UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394501 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394547 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394581 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.394623 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394659 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394695 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394726 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394759 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394791 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394829 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394868 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395007 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on 
node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395025 4699 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395040 4699 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395055 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395069 4699 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395082 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395096 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395110 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395143 4699 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395157 4699 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395172 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395190 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395205 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395218 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395231 4699 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395244 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395258 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395272 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395285 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395298 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395311 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395324 4699 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395341 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395354 4699 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.395368 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.416052 4699 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391586 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391594 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419010 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391820 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391936 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.391948 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392037 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392212 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.392987 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393372 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393382 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393523 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393655 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393778 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419250 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394028 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.393485 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394218 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394291 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394331 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394331 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394394 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394796 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.394860 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.396567 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.396582 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397064 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397141 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397149 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397194 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397333 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397603 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397653 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397686 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397728 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.397745 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.398656 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.398791 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.398995 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.399081 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400003 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400020 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400238 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400340 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400452 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400493 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400559 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400684 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.400837 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402268 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402270 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402441 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402459 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402628 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402798 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.402864 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403109 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403303 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403436 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403494 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403661 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.403686 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.404096 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.404388 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.404638 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:12:20.904512783 +0000 UTC m=+86.715339387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.405372 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.405532 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.406079 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.407599 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.407716 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.407759 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.407828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408003 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408176 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408275 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408588 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408744 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.408842 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409053 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409157 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409300 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409408 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409619 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.409910 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.410230 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.410282 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.411278 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.411406 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.411592 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.411905 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.412407 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.412914 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.412982 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.413328 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.412879 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.414587 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.415209 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.415339 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.414980 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.417754 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.417939 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.417980 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.418516 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419185 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419534 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419615 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.419732 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.421690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.421858 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.422062 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.422226 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.422428 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.423374 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.424065 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.424882 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.424715 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.425382 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.425473 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.425576 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:20.925552761 +0000 UTC m=+86.736379385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.426044 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.426149 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.426789 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.426901 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.427266 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:20.927243917 +0000 UTC m=+86.738070541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.428326 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.428677 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.492565 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.492857 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.493294 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.493369 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.493654 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.493775 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494018 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494157 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494173 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494232 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494284 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494291 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494320 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.494415 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.495056 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.495101 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.495148 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.495178 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.495407 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.495599 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.497201 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.497306 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.495714 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.496397 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.496401 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.496527 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.496835 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.496934 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:20.996886048 +0000 UTC m=+86.807712482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.497873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.497980 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.497998 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.498301 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:20.998234565 +0000 UTC m=+86.809061119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.499023 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.497710 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.496963 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.498375 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.498398 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.499283 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501196 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501251 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501265 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" 
(UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501277 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501288 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501303 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501315 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501327 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501337 4699 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501351 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501362 4699 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501373 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501384 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501398 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501410 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501421 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501439 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501450 
4699 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501460 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501470 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501483 4699 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501493 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501504 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501515 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501533 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501543 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501555 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501568 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501578 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501587 4699 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501596 4699 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501608 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501617 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501630 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501640 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501652 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501662 4699 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501671 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501681 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501694 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501721 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501732 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501746 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501755 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501764 4699 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501775 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501788 
4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501797 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501806 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501817 4699 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501828 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501837 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501848 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501861 4699 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501870 4699 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501880 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501892 4699 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501904 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501914 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501928 4699 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501940 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501956 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501969 4699 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501981 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.501994 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502008 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502017 4699 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502027 4699 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502040 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502050 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502062 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502072 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502083 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502092 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502102 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502132 4699 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502144 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502154 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502163 4699 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502175 4699 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502184 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502235 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") 
on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502249 4699 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502266 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502279 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502288 4699 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502346 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502391 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502410 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc 
kubenswrapper[4699]: I0226 11:12:20.502437 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502480 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502495 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502510 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502528 4699 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502548 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502561 4699 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502576 4699 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502594 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502609 4699 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502624 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502641 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502655 4699 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502670 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502684 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502702 4699 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502715 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502728 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502742 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502759 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502773 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502790 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 
11:12:20.502804 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502820 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502863 4699 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502880 4699 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502898 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502912 4699 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502927 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502942 4699 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502960 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502975 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.502988 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503001 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503019 4699 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503033 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503047 4699 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node 
\"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503064 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503078 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503092 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503179 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503195 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503208 4699 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503222 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503234 4699 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503251 4699 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503264 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503276 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503289 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503306 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503319 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503705 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.503896 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.505128 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.505818 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.505813 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.505648 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506168 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506294 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506365 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506589 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506682 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506794 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506786 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506818 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.506800 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.507392 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.507639 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.507781 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508069 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508004 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508128 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508248 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508534 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508563 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.508995 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510440 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510921 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510490 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.510947 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.515327 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.524329 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.524824 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.536663 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.538264 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.603945 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.603986 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604001 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604012 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604024 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604036 4699 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604047 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604058 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604068 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604077 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604086 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604095 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604103 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604132 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604152 4699 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604163 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604173 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604196 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604204 4699 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604213 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604221 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604230 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604238 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604245 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604253 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604262 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604270 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604279 4699 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.604287 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.613568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.613623 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.613635 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.613653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.613665 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.716384 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.716426 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.716437 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.716453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.716464 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.763097 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.770899 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 11:12:20 crc kubenswrapper[4699]: W0226 11:12:20.786696 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ea395e839fe793a72c8ec8cb739811a6921caac61fd8ceef6e30ca19eb7732a6 WatchSource:0}: Error finding container ea395e839fe793a72c8ec8cb739811a6921caac61fd8ceef6e30ca19eb7732a6: Status 404 returned error can't find the container with id ea395e839fe793a72c8ec8cb739811a6921caac61fd8ceef6e30ca19eb7732a6
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.789198 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.819191 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.819765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.819802 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.819827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.819845 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.907213 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:12:20 crc kubenswrapper[4699]: E0226 11:12:20.907445 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:12:21.90739545 +0000 UTC m=+87.718221884 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.923164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.923222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.923237 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.923255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:20 crc kubenswrapper[4699]: I0226 11:12:20.923267 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:20Z","lastTransitionTime":"2026-02-26T11:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.008371 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.008446 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.008473 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.008503 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.008588 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.008647 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:22.008629868 +0000 UTC m=+87.819456302 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009072 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009136 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:22.009107791 +0000 UTC m=+87.819934225 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009209 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009225 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009237 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009267 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:22.009258385 +0000 UTC m=+87.820084819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009322 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009333 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009342 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.009366 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:22.009359028 +0000 UTC m=+87.820185462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.027094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.027186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.027234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.027260 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.027273 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.130148 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.130212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.130224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.130245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.130256 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.233030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.233071 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.233082 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.233102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.233130 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.336174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.336233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.336244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.336265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.336287 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.439235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.439307 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.439323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.439348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.439361 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.541753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.541821 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.541836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.541858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.541870 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.624897 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.624987 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"133a8e69467e5c97f3a135ab21c364e50741580a1dc47ad478db24e222d78eb9"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.627577 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.627639 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ea395e839fe793a72c8ec8cb739811a6921caac61fd8ceef6e30ca19eb7732a6"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.629271 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"34e967ea2bb3904d502b9c8d4ce015eed4342b553e45c76324f0bb014f3d76fe"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.643385 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.644690 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.644766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.644782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.644810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.644834 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.656874 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.670102 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.685917 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.700629 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.713601 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.747221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.747258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.747267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.747281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.747291 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.849802 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.849844 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.849852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.849868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.849878 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.913895 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:21 crc kubenswrapper[4699]: E0226 11:12:21.914197 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:12:23.914160981 +0000 UTC m=+89.724987435 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.951599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.951689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.951706 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:21 crc kubenswrapper[4699]: I0226 11:12:21.951723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:21 crc kubenswrapper[4699]: 
I0226 11:12:21.951734 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:21Z","lastTransitionTime":"2026-02-26T11:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.014801 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.014872 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.014900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.014932 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015013 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015022 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015044 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015073 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015090 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015102 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:24.015080619 +0000 UTC m=+89.825907073 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015154 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:24.01511181 +0000 UTC m=+89.825938254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015171 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:24.015163672 +0000 UTC m=+89.825990126 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015244 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015296 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015314 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.015412 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:24.015383708 +0000 UTC m=+89.826210312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.054071 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.054163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.054186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.054205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.054217 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.157168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.157238 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.157249 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.157272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.157285 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.259977 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260002 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260255 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.260520 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260583 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.260621 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.260972 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: E0226 11:12:22.260813 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.261040 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.264748 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.265598 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.267467 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.268676 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.270046 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.270763 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.272738 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.274179 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.275014 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.276182 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.276922 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.278390 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.279023 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.279678 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.280768 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.281594 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.282715 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.283250 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.283977 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.285322 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.285931 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.287515 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.288038 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.289267 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.290757 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.291566 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.292929 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.293464 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.294677 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.295416 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.296524 4699 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.296651 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.299218 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.300365 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.301908 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.304468 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.305231 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.306591 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.307565 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.309605 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.310401 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.311651 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.312333 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.313561 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.314030 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.314949 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.315912 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.317515 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.318049 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.318978 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.319562 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.320766 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.321527 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.321995 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.363711 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.363763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc 
kubenswrapper[4699]: I0226 11:12:22.363774 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.363792 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.363804 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.465466 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.465514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.465523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.465536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.465545 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.567421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.567454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.567464 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.567477 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.567487 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.634380 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.647824 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.663167 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.669888 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.669940 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.669953 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.669968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.669980 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.680044 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T
11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.695132 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.708778 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.726080 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.773285 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.773325 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.773337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.773352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.773365 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.876934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.877001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.877017 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.877042 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.877056 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.980204 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.980277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.980292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.980315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:22 crc kubenswrapper[4699]: I0226 11:12:22.980328 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:22Z","lastTransitionTime":"2026-02-26T11:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.086394 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.086446 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.086471 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.086491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.086505 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.189544 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.189599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.189612 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.189631 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.189644 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.292976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.293020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.293039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.293057 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.293072 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.395651 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.395714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.395725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.395740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.395751 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.406241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.406328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.406345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.406372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.406425 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.421579 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:23Z is after 2025-08-24T17:21:41Z"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.429028 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.430294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.430324 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.430346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.430359 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.448394 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:23Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.454954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.455015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.455029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.455049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.455061 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.471728 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:23Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.477723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.477806 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.477828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.477854 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.477869 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.494963 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:23Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.501038 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.501099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.501111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.501151 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.501165 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.517171 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:23Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.517352 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.520400 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.520454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.520468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.520488 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.520503 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.623416 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.623471 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.623494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.623511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.623524 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.727208 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.727267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.727281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.727308 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.727324 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.830359 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.830508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.830522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.830541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.830556 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.931551 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:23 crc kubenswrapper[4699]: E0226 11:12:23.931930 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:27.931897119 +0000 UTC m=+93.742723563 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.932852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.933612 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.933671 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.933697 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:23 crc kubenswrapper[4699]: I0226 11:12:23.933712 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:23Z","lastTransitionTime":"2026-02-26T11:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.033054 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.033238 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033413 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033452 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.033457 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033480 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.033509 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033546 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:28.033526597 +0000 UTC m=+93.844353221 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033619 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033655 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033666 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033725 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:28.033706222 +0000 UTC m=+93.844532646 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033739 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033774 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033801 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:28.033794424 +0000 UTC m=+93.844620858 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.033836 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:28.033807624 +0000 UTC m=+93.844634248 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.037174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.037215 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.037228 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.037249 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.037265 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.139949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.140043 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.140061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.140087 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.140103 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.241880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.241930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.241944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.241960 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.241971 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.260481 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.260481 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.260577 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.260676 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.260751 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.261304 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.275775 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.276173 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.276364 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.344477 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.344536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.344548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.344566 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.344579 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.453715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.454452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.454714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.455372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.455415 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.558547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.558609 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.558621 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.558641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.558654 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.641970 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.642605 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:24 crc kubenswrapper[4699]: E0226 11:12:24.642786 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668152 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668214 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.668104 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.687084 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.702999 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.721772 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.739581 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.755882 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.769401 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.771010 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.771042 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.771218 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.771236 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.771245 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.874502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.874542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.874553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.874569 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.874581 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.977134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.977162 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.977171 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.977184 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:24 crc kubenswrapper[4699]: I0226 11:12:24.977194 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:24Z","lastTransitionTime":"2026-02-26T11:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.079300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.079358 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.079373 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.079390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.079403 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.182181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.182243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.182256 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.182272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.182307 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.284882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.285236 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.285345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.285436 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.285517 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.388448 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.388485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.388493 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.388506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.388514 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.490739 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.491069 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.491172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.491245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.491314 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.594142 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.594727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.594801 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.594870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.594940 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.702253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.702310 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.702326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.702346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.702366 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.804892 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.804959 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.804973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.804990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.805002 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.908345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.908733 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.908868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.908966 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:25 crc kubenswrapper[4699]: I0226 11:12:25.909067 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:25Z","lastTransitionTime":"2026-02-26T11:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.011558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.011640 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.011654 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.011672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.011684 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.114716 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.114754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.114767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.114784 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.114796 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.217537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.217827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.217911 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.217996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.218079 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.260733 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.260806 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.260842 4699 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:26 crc kubenswrapper[4699]: E0226 11:12:26.260997 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:26 crc kubenswrapper[4699]: E0226 11:12:26.261075 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:26 crc kubenswrapper[4699]: E0226 11:12:26.261219 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.281163 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.296493 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.314180 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.320894 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.321240 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.321349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.321437 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.321512 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.331414 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.349954 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.364223 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.379537 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.425192 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.425243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.425259 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.425276 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.425288 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.528322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.528371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.528382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.528405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.528416 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.631175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.631223 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.631237 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.631251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.631263 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.736478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.736533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.736554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.736575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.736590 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.839765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.839798 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.839807 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.839823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.839833 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.943444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.943538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.943552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.943568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:26 crc kubenswrapper[4699]: I0226 11:12:26.943580 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:26Z","lastTransitionTime":"2026-02-26T11:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.046226 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.046273 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.046284 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.046300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.046313 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.149921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.149972 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.149983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.150002 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.150014 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.197922 4699 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.253371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.253422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.253434 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.253452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.253464 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.356637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.356675 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.356685 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.356700 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.356712 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.458666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.458703 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.458714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.458731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.458742 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.561581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.561630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.561642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.561658 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.561671 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.663730 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.663784 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.663798 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.663822 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.663834 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.766622 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.766677 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.766688 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.766706 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.766720 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.870815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.870870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.870882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.870899 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.870910 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.969282 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:27 crc kubenswrapper[4699]: E0226 11:12:27.969518 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:35.969495351 +0000 UTC m=+101.780321785 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.973238 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.973355 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.973370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.973395 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:27 crc kubenswrapper[4699]: I0226 11:12:27.973408 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:27Z","lastTransitionTime":"2026-02-26T11:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.070562 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.070654 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.070682 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.070712 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070838 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070849 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070888 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070897 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:36.070878933 +0000 UTC m=+101.881705367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070902 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.070973 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:36.070949435 +0000 UTC m=+101.881775939 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071052 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071103 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071150 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071051 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071216 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:36.071195281 +0000 UTC m=+101.882021765 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.071241 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:36.071228862 +0000 UTC m=+101.882055406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.077049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.077105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.077153 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.077168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.077179 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.179656 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.179724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.179740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.179765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.179783 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.260324 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.260399 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.260356 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.260491 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.260547 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:28 crc kubenswrapper[4699]: E0226 11:12:28.260599 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.281968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.282003 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.282014 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.282030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.282041 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.384439 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.384740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.384814 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.384875 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.384936 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.488525 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.488592 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.488606 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.488628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.488640 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.591274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.591530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.591595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.591672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.591736 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.696443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.696492 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.696502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.696521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.696536 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.799321 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.799377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.799405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.799425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.799442 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.904206 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.904287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.904307 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.904518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:28 crc kubenswrapper[4699]: I0226 11:12:28.904559 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:28Z","lastTransitionTime":"2026-02-26T11:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.008549 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.008597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.008610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.008630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.008642 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.112629 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.112692 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.112704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.112723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.112733 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.216616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.216684 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.216697 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.216710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.216722 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.319101 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.319187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.319200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.319218 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.319235 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.421752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.421813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.421825 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.421844 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.421857 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.525136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.525185 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.525195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.525211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.525223 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.627894 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.627937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.627947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.627963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.627974 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.730284 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.730316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.730324 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.730338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.730347 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.832897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.832940 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.832949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.832966 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.832976 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.935509 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.935551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.935562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.935578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:29 crc kubenswrapper[4699]: I0226 11:12:29.935588 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:29Z","lastTransitionTime":"2026-02-26T11:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.038532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.038581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.038594 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.038610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.038624 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.141140 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.141176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.141186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.141201 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.141213 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.244403 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.244444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.244461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.244480 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.244492 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.261380 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.261445 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:30 crc kubenswrapper[4699]: E0226 11:12:30.261556 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:30 crc kubenswrapper[4699]: E0226 11:12:30.261673 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.261781 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:30 crc kubenswrapper[4699]: E0226 11:12:30.261857 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.346886 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.346926 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.346940 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.346957 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.346969 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.449915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.449958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.449969 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.449983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.449994 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.552926 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.552964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.552974 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.552988 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.552998 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.655504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.655533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.655542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.655555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.655564 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.757963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.757999 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.758009 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.758022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.758032 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.860570 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.860610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.860621 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.860636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.860648 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.963041 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.963081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.963093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.963110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:30 crc kubenswrapper[4699]: I0226 11:12:30.963143 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:30Z","lastTransitionTime":"2026-02-26T11:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.065384 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.065425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.065440 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.065459 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.065472 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.167866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.167910 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.167921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.167938 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.167951 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.270842 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.270878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.270889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.270904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.270915 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.373544 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.373582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.373590 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.373610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.373619 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.476641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.476710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.476731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.476754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.476768 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.579416 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.579446 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.579457 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.579474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.579486 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.681813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.681858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.681866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.681880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.681890 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.784585 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.784645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.784659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.784676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.784687 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.886956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.887010 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.887021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.887037 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.887050 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.989661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.990515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.990528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.990542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:31 crc kubenswrapper[4699]: I0226 11:12:31.990552 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:31Z","lastTransitionTime":"2026-02-26T11:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.092150 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.092193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.092204 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.092221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.092232 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.194090 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.194163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.194172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.194188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.194198 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.260385 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.260422 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:32 crc kubenswrapper[4699]: E0226 11:12:32.260552 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.260570 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:32 crc kubenswrapper[4699]: E0226 11:12:32.260720 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:32 crc kubenswrapper[4699]: E0226 11:12:32.260773 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.299973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.300029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.300050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.300070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.300086 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.402359 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.402404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.402418 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.402433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.402443 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.504246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.504286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.504299 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.504319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.504333 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.606618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.606944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.607055 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.607205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.607420 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.710586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.710640 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.710651 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.710671 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.710683 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.813662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.813733 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.813747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.813767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.813782 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.916561 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.916614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.916626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.916647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:32 crc kubenswrapper[4699]: I0226 11:12:32.916662 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:32Z","lastTransitionTime":"2026-02-26T11:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.018600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.018641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.018652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.018670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.018681 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.120708 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.120748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.120764 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.120780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.120791 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.222713 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.222745 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.222753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.222764 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.222774 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.324689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.324763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.324776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.324792 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.324804 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.426568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.426616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.426629 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.426645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.426658 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.528972 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.529004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.529013 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.529026 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.529037 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.543618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.543661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.543670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.543686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.543697 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.557350 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:33Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.561665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.561716 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.561727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.561747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.561763 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.574840 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:33Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.578293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.578327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.578338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.578355 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.578367 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.590460 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:33Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.594725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.594769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.594780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.594797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.594809 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.609604 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:33Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.613869 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.613931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.613944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.613963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.613977 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.627868 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:33Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:33 crc kubenswrapper[4699]: E0226 11:12:33.628042 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.631317 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.631348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.631356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.631368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.631378 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.735105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.735188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.735200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.735221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.735232 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.837991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.838075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.838090 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.838107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.838141 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.940476 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.940520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.940532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.940548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:33 crc kubenswrapper[4699]: I0226 11:12:33.940560 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:33Z","lastTransitionTime":"2026-02-26T11:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.043106 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.043178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.043191 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.043210 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.043223 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.147025 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.147222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.147235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.147251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.147263 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.249704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.249748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.249758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.249772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.249783 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.260349 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.260413 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.260380 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:34 crc kubenswrapper[4699]: E0226 11:12:34.260499 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:34 crc kubenswrapper[4699]: E0226 11:12:34.260565 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:34 crc kubenswrapper[4699]: E0226 11:12:34.260742 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.352723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.352779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.352789 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.352809 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.352821 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.455848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.455920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.455937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.455965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.455982 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.559368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.559417 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.559436 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.559456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.559467 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.662966 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.663027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.663039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.663060 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.663074 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.766719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.766756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.766766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.766780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.766792 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.869756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.869795 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.869803 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.869817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.869826 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.972370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.972418 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.972428 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.972443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:34 crc kubenswrapper[4699]: I0226 11:12:34.972453 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:34Z","lastTransitionTime":"2026-02-26T11:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.073878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.073906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.073915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.073927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.073937 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.176291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.176343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.176361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.176378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.176390 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.278761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.278813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.278821 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.278834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.278844 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.381378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.381426 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.381438 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.381454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.381467 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.483404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.483463 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.483480 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.483498 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.483514 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.586191 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.586228 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.586236 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.586252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.586264 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.688727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.688756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.688766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.688778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.688808 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.791252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.791289 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.791305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.791321 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.791334 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.894398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.894434 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.894448 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.894465 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.894478 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.997570 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.997603 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.997612 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.997626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:35 crc kubenswrapper[4699]: I0226 11:12:35.997635 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:35Z","lastTransitionTime":"2026-02-26T11:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.058352 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.058543 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:52.058524546 +0000 UTC m=+117.869350980 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.100608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.100643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.100653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.100666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.100675 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.159839 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.159884 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.159905 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.159922 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160035 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160096 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:52.160077902 +0000 UTC m=+117.970904336 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160306 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160334 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160348 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160402 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-26 11:12:52.16039258 +0000 UTC m=+117.971219014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160422 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160635 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:52.160609606 +0000 UTC m=+117.971436130 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160482 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160671 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160687 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.160724 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:52.160716179 +0000 UTC m=+117.971542743 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.203663 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.203709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.203718 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.203732 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.203744 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.260345 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.260408 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.260486 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.260365 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.260770 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:36 crc kubenswrapper[4699]: E0226 11:12:36.260887 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.274406 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.286965 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.302152 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.305538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.305569 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.305577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.305589 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.305598 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.316412 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.330609 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.344046 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.356171 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:36Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.408180 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.408214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.408224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.408241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.408254 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.510269 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.510313 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.510325 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.510340 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.510352 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.612196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.612239 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.612250 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.612266 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.612280 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.716345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.716402 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.716413 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.716432 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.716444 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.819882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.819921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.819930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.819946 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.819960 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.923267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.923359 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.923384 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.923416 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:36 crc kubenswrapper[4699]: I0226 11:12:36.923438 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:36Z","lastTransitionTime":"2026-02-26T11:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.025618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.025672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.025684 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.025702 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.025715 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.128569 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.128631 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.128644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.128664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.128680 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.231414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.231456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.231467 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.231484 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.231497 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.333996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.334045 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.334056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.334074 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.334085 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.436315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.436362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.436370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.436385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.436395 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.539024 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.539071 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.539083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.539145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.539158 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.642633 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.642683 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.642696 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.642715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.642731 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.744594 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.744843 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.744907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.744966 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.745021 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.847580 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.847635 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.847652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.847673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.847686 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.950396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.950927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.950994 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.951089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:37 crc kubenswrapper[4699]: I0226 11:12:37.951195 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:37Z","lastTransitionTime":"2026-02-26T11:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.054431 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.054663 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.054753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.054812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.054904 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.157778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.158144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.158239 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.158428 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.158524 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.260083 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:38 crc kubenswrapper[4699]: E0226 11:12:38.260308 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.260342 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:38 crc kubenswrapper[4699]: E0226 11:12:38.260462 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.260105 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:38 crc kubenswrapper[4699]: E0226 11:12:38.260710 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.261602 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.262022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.262189 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.262295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.262386 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.364927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.364960 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.364969 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.364983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.364992 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.466953 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.466990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.467001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.467018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.467030 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.570230 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.570334 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.570348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.570372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.570394 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.672536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.672802 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.672863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.672930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.673037 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.775384 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.775578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.775595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.775619 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.775632 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.878470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.878516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.878528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.878543 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.878555 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.980474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.980516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.980527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.980541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:38 crc kubenswrapper[4699]: I0226 11:12:38.980552 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:38Z","lastTransitionTime":"2026-02-26T11:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.082719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.082757 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.082769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.082783 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.082794 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.185585 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.185643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.185654 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.185671 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.185683 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.287597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.287643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.287652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.287670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.287681 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.390062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.390353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.390438 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.390502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.390615 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.493453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.493496 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.493507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.493522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.493535 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.596277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.596327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.596337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.596354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.596365 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.698330 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.698637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.698720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.698818 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.698895 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.801290 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.801323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.801335 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.801351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.801363 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.903361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.903401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.903412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.903427 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:39 crc kubenswrapper[4699]: I0226 11:12:39.903440 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:39Z","lastTransitionTime":"2026-02-26T11:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.005697 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.005755 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.005769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.005785 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.005797 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.108911 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.109036 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.109047 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.109064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.109091 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.211595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.211627 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.211639 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.211653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.211665 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.260344 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.260841 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:40 crc kubenswrapper[4699]: E0226 11:12:40.261018 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.261193 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.261224 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:40 crc kubenswrapper[4699]: E0226 11:12:40.261272 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:40 crc kubenswrapper[4699]: E0226 11:12:40.261605 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:40 crc kubenswrapper[4699]: E0226 11:12:40.261657 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.313723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.314087 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.314239 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.314328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.314417 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.416676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.416709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.416720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.416814 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.416834 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.519303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.519343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.519353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.519369 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.519378 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.621819 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.621872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.621882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.621897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.621907 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.724686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.724727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.724736 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.724751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.724763 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.826353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.826388 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.826397 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.826410 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.826420 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.851665 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gbl2h"] Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.852222 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.854448 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.854512 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.854587 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.866898 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.878657 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.892033 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.904521 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.919607 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.930334 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.930404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.930420 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.930438 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.930472 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:40Z","lastTransitionTime":"2026-02-26T11:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.934850 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.945891 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:40 crc kubenswrapper[4699]: I0226 11:12:40.961372 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:40Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.009858 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db105b7b-9325-4f20-a760-06c045ea844f-hosts-file\") pod \"node-resolver-gbl2h\" (UID: \"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.009935 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9n6f\" (UniqueName: \"kubernetes.io/projected/db105b7b-9325-4f20-a760-06c045ea844f-kube-api-access-t9n6f\") pod \"node-resolver-gbl2h\" (UID: \"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.033247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.033287 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.033295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.033309 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.033319 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.110683 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9n6f\" (UniqueName: \"kubernetes.io/projected/db105b7b-9325-4f20-a760-06c045ea844f-kube-api-access-t9n6f\") pod \"node-resolver-gbl2h\" (UID: \"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.110764 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db105b7b-9325-4f20-a760-06c045ea844f-hosts-file\") pod \"node-resolver-gbl2h\" (UID: \"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.110860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db105b7b-9325-4f20-a760-06c045ea844f-hosts-file\") pod \"node-resolver-gbl2h\" (UID: 
\"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.135951 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.135995 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.136006 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.136024 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.136036 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.138684 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9n6f\" (UniqueName: \"kubernetes.io/projected/db105b7b-9325-4f20-a760-06c045ea844f-kube-api-access-t9n6f\") pod \"node-resolver-gbl2h\" (UID: \"db105b7b-9325-4f20-a760-06c045ea844f\") " pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.165468 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gbl2h" Feb 26 11:12:41 crc kubenswrapper[4699]: W0226 11:12:41.179026 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb105b7b_9325_4f20_a760_06c045ea844f.slice/crio-f73bc9858ad6a2aa98dfd93ca9f6bf62bd314c1c2fb6f45ef41eced5e8a668ec WatchSource:0}: Error finding container f73bc9858ad6a2aa98dfd93ca9f6bf62bd314c1c2fb6f45ef41eced5e8a668ec: Status 404 returned error can't find the container with id f73bc9858ad6a2aa98dfd93ca9f6bf62bd314c1c2fb6f45ef41eced5e8a668ec Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.225598 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2k6b7"] Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.225973 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-28p79"] Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.226220 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tfp9h"] Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.226873 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.227529 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.228001 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.231875 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.235368 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.235694 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.235951 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.236298 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.238588 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.238671 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.238607 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.238844 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.239014 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.239206 
4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.239384 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.243723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.244295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.244310 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.244330 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.244346 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.253414 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.271685 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.288172 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.305361 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313042 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g9f\" (UniqueName: \"kubernetes.io/projected/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-kube-api-access-25g9f\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-hostroot\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313135 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-mcd-auth-proxy-config\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313174 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-cnibin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-multus\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313220 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-system-cni-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313240 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-k8s-cni-cncf-io\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313262 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-etc-kubernetes\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313282 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8bl\" (UniqueName: \"kubernetes.io/projected/32ce77d1-5287-4674-aeda-810070efbb29-kube-api-access-6g8bl\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313297 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-bin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313323 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313342 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313359 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tczc\" (UniqueName: \"kubernetes.io/projected/95d160b5-697e-42fa-8cd0-8b7b337820c4-kube-api-access-5tczc\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313376 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-cni-binary-copy\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313391 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-multus-daemon-config\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313408 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-os-release\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313424 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-socket-dir-parent\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313445 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-kubelet\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313461 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-proxy-tls\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc 
kubenswrapper[4699]: I0226 11:12:41.313481 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-conf-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313498 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-cnibin\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313512 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-os-release\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313544 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-netns\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313596 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc 
kubenswrapper[4699]: I0226 11:12:41.313620 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-multus-certs\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313654 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313670 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-system-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.313693 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-rootfs\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.319704 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.340306 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.347860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.347924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.347937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.347954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: 
I0226 11:12:41.347964 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.360669 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.377760 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.393093 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.412254 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414536 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-cnibin\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414586 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-os-release\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414608 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-netns\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414624 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-multus-certs\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414652 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414666 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414689 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-system-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414710 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-rootfs\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414729 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g9f\" (UniqueName: 
\"kubernetes.io/projected/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-kube-api-access-25g9f\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414745 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-hostroot\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414760 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-cnibin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414776 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-multus\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414792 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-mcd-auth-proxy-config\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414807 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-system-cni-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414825 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-k8s-cni-cncf-io\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414840 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-etc-kubernetes\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-bin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8bl\" (UniqueName: \"kubernetes.io/projected/32ce77d1-5287-4674-aeda-810070efbb29-kube-api-access-6g8bl\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414899 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tczc\" (UniqueName: \"kubernetes.io/projected/95d160b5-697e-42fa-8cd0-8b7b337820c4-kube-api-access-5tczc\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414914 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-cni-binary-copy\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414935 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-multus-daemon-config\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414964 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414979 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-os-release\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.414996 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-socket-dir-parent\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415010 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-kubelet\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415024 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-proxy-tls\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415038 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-conf-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415099 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-conf-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415164 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-k8s-cni-cncf-io\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-etc-kubernetes\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415213 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-bin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415208 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-system-cni-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415269 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-cnibin\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: 
\"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415392 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-os-release\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415414 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-rootfs\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415663 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-system-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415660 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-cni-dir\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415664 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-netns\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415730 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-cnibin\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415768 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-hostroot\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415792 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-run-multus-certs\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.415795 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-cni-multus\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416291 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/95d160b5-697e-42fa-8cd0-8b7b337820c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-os-release\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-host-var-lib-kubelet\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32ce77d1-5287-4674-aeda-810070efbb29-multus-socket-dir-parent\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416702 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-cni-binary-copy\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416718 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.416818 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32ce77d1-5287-4674-aeda-810070efbb29-multus-daemon-config\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.421883 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d160b5-697e-42fa-8cd0-8b7b337820c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.422649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-proxy-tls\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.433040 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.435057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g9f\" (UniqueName: \"kubernetes.io/projected/e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff-kube-api-access-25g9f\") pod \"machine-config-daemon-28p79\" (UID: \"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\") " pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.437351 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tczc\" (UniqueName: \"kubernetes.io/projected/95d160b5-697e-42fa-8cd0-8b7b337820c4-kube-api-access-5tczc\") pod 
\"multus-additional-cni-plugins-tfp9h\" (UID: \"95d160b5-697e-42fa-8cd0-8b7b337820c4\") " pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.439059 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8bl\" (UniqueName: \"kubernetes.io/projected/32ce77d1-5287-4674-aeda-810070efbb29-kube-api-access-6g8bl\") pod \"multus-2k6b7\" (UID: \"32ce77d1-5287-4674-aeda-810070efbb29\") " pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.448936 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.451154 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.451189 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.451198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.451211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.451221 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.464634 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.479632 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.498234 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.515529 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.531031 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.547183 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.553790 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.554064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.554181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.554262 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.554333 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.562344 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.563544 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.572826 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2k6b7" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.582146 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.583685 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.598653 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.604394 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cw6vx"] Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.605840 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.609778 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.610070 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.610233 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.610434 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.610545 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.610682 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.611140 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.624055 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.642920 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.660436 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.660492 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.660506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 
11:12:41.660522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.660536 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.664720 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.680912 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.695550 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerStarted","Data":"41434ad1774419837d0cfa44cc794bc1cf04b6d532f022c99cc9412efa657e08"} Feb 
26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.706407 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gbl2h" event={"ID":"db105b7b-9325-4f20-a760-06c045ea844f","Type":"ContainerStarted","Data":"ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.706454 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gbl2h" event={"ID":"db105b7b-9325-4f20-a760-06c045ea844f","Type":"ContainerStarted","Data":"f73bc9858ad6a2aa98dfd93ca9f6bf62bd314c1c2fb6f45ef41eced5e8a668ec"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.706499 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.710151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"75bb32524199dbcede7aa4c16881b59120ad0fef6384c38ee908897771299028"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.713431 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerStarted","Data":"e2939f6d7ac86a30ce90043410998ebc04c1e55df27aa2c7369ea114b2b85f39"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.720994 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash\") pod \"ovnkube-node-cw6vx\" (UID: 
\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721075 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721156 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721201 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn\") pod \"ovnkube-node-cw6vx\" (UID: 
\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmg2\" (UniqueName: \"kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721244 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721279 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721414 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721452 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721478 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721500 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721522 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721542 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721560 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.721584 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.723642 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.739067 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.755692 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.762749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.762797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.762810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.762827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.762839 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.772518 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.791344 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.810988 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822516 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822545 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822576 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822593 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822608 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822624 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822652 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet\") pod 
\"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822671 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmg2\" (UniqueName: \"kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822689 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822705 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822732 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822748 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822762 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822776 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822807 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822823 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822863 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.822937 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.823601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.824743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 
11:12:41.824807 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825295 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825378 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825408 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825435 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825702 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825780 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.825845 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826100 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826193 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch\") pod 
\"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826268 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826318 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826389 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.826306 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.829593 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.842506 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.848259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmg2\" (UniqueName: \"kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2\") pod \"ovnkube-node-cw6vx\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.869627 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.883183 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.883235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.883249 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.883268 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.883280 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.905317 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.922581 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.925955 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:41 crc kubenswrapper[4699]: W0226 11:12:41.942288 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd12b2df_7af6_45bc_88e7_d5e5e6451e65.slice/crio-90b46f5a3e61ec03394a2be7ff4739209b903f31912a7a66807fca0693899985 WatchSource:0}: Error finding container 90b46f5a3e61ec03394a2be7ff4739209b903f31912a7a66807fca0693899985: Status 404 returned error can't find the container with id 90b46f5a3e61ec03394a2be7ff4739209b903f31912a7a66807fca0693899985 Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.948696 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.966369 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.986202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.986255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.986276 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.986296 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.986309 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:41Z","lastTransitionTime":"2026-02-26T11:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:41 crc kubenswrapper[4699]: I0226 11:12:41.988603 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:41Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.006494 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.040670 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.064309 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.081829 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.093272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.093322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.093338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.093360 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.093379 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.107666 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.197091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.197146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.197159 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.197175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.197188 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.260174 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.260257 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:42 crc kubenswrapper[4699]: E0226 11:12:42.260343 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:42 crc kubenswrapper[4699]: E0226 11:12:42.260435 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.260514 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:42 crc kubenswrapper[4699]: E0226 11:12:42.260666 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.302769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.302828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.302838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.302858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.302873 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.406141 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.406185 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.406195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.406213 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.406227 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.509013 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.509062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.509073 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.509092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.509103 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.612343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.612415 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.612428 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.612452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.612465 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.715607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.715638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.715647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.715659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.715668 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.721558 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.721603 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.723814 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerStarted","Data":"b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.725583 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01" exitCode=0 Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.725643 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.729165 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124" exitCode=0 Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.729253 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.729475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"90b46f5a3e61ec03394a2be7ff4739209b903f31912a7a66807fca0693899985"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.737625 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.757788 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.776220 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.792443 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9
d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T1
1:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.817945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.817981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.817993 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.818009 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.818021 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.818743 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.834547 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.851104 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.868658 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.881563 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.898135 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.913497 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.921354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.921409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.921425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.921442 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.921455 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:42Z","lastTransitionTime":"2026-02-26T11:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.931144 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.951084 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.963545 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.979697 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:42 crc kubenswrapper[4699]: I0226 11:12:42.999730 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:42Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.018208 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.024142 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.024193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.024205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.024226 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.024239 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.033374 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.061126 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.089943 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.108373 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.124729 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.129501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.129608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.129687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.129710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.129780 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.150181 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.165529 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.241033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.241082 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.241096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.241134 4699 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.241150 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.348806 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.348853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.348862 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.348879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.348911 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.451579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.451622 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.451634 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.451651 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.451665 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.555007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.555044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.555056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.555073 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.555085 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.657592 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.657634 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.657644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.657661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.657673 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.739529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.739770 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.739846 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.739905 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.739998 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.742126 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerStarted","Data":"83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61"} Feb 26 11:12:43 crc 
kubenswrapper[4699]: I0226 11:12:43.760889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.760867 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.760928 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.761088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.761103 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.761131 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.785736 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.804995 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.821334 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.837080 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.850928 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.865110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.865445 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.865541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.865645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.865727 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.870058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.870355 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.870444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.870560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.870649 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.874981 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.888557 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.893810 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.894372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.894433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.894473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.894489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.894502 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.910594 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.911748 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.916920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.917008 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.917022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.917039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.917094 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.940393 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.943780 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.950387 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.950481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.950494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.950514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.950527 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.961780 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.963908 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.975562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.975631 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.975645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.975665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.975679 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.990973 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.991564 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:43Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:43 crc kubenswrapper[4699]: E0226 11:12:43.991711 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.993878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.993930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.993944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.993963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:43 crc kubenswrapper[4699]: I0226 11:12:43.993979 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:43Z","lastTransitionTime":"2026-02-26T11:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.096444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.096509 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.096527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.096554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.096571 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.198826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.198868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.198878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.198895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.198907 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.260484 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.260634 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:44 crc kubenswrapper[4699]: E0226 11:12:44.260753 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.260815 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:44 crc kubenswrapper[4699]: E0226 11:12:44.261002 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:44 crc kubenswrapper[4699]: E0226 11:12:44.261187 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.301813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.301854 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.301863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.301877 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.301888 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.403821 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.403863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.403878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.403895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.403906 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.506575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.506615 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.506625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.506641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.506653 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.608715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.608763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.608775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.608789 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.608799 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.711477 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.711517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.711526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.711541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.711553 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.751051 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.813787 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.813826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.813836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.813851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.813863 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.922066 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.922149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.922164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.922180 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:44 crc kubenswrapper[4699]: I0226 11:12:44.922520 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:44Z","lastTransitionTime":"2026-02-26T11:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.026645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.026877 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.026890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.026907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.026919 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.135326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.135396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.135408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.135425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.135458 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.238174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.238293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.238303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.238316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.238328 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.276672 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.341752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.341797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.341810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.341855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.341870 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.444294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.444337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.444349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.444364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.444378 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.547358 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.547403 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.547414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.547431 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.547447 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.650494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.650538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.650547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.650563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.650573 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.753214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.753265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.753275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.753293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.753314 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.756617 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61" exitCode=0 Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.756760 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.777551 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.800202 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.835190 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.850518 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.857042 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.857079 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.857089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc 
kubenswrapper[4699]: I0226 11:12:45.857106 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.857135 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.866352 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.885460 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.901461 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.918928 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.937169 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.953796 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.959476 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.959516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.959527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.959545 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.959580 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:45Z","lastTransitionTime":"2026-02-26T11:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.978216 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:45 crc kubenswrapper[4699]: I0226 11:12:45.999691 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:45Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.022896 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.064659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.064722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.064739 4699 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.064760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.064777 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.167739 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.168353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.168384 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.168403 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.168416 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.260621 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:46 crc kubenswrapper[4699]: E0226 11:12:46.261371 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.260748 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:46 crc kubenswrapper[4699]: E0226 11:12:46.261542 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.260708 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:46 crc kubenswrapper[4699]: E0226 11:12:46.261626 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.270641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.270690 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.270703 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.270727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.270742 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.277235 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z 
is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.292924 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.308787 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.327101 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.343100 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.358588 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.373835 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.373897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.373913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.373935 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.373947 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.381061 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.405228 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.422554 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.440658 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.464621 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.476608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.476672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.476687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.476713 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.476731 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.480727 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.501897 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.580293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.580351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.580364 4699 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.580381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.580393 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.683534 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.683571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.683582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.683600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.683611 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.767167 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b" exitCode=0 Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.767282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.774158 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.783611 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.788258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc 
kubenswrapper[4699]: I0226 11:12:46.788314 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.788326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.788343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.788356 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.816172 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.835536 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.854166 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.872960 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.890108 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.893560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.893616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.893626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.893661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.893673 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.918114 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.938015 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.960532 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.981088 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:46Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.997007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.997055 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.997065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.997081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:46 crc kubenswrapper[4699]: I0226 11:12:46.997091 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:46Z","lastTransitionTime":"2026-02-26T11:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.004984 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.022905 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.041882 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 
11:12:47.100911 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.100952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.100962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.100977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.100988 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.205876 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.205936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.205945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.205962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.205972 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.308636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.308668 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.308677 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.308719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.308729 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.411896 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.411931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.411942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.411958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.411970 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.514527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.514559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.514568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.514581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.514594 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.617005 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.617052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.617063 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.617081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.617095 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.719913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.719956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.719967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.719983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.719994 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.751086 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gs59q"] Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.751553 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.754529 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.754841 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.754885 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.755583 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.770193 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c
607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.779937 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9" exitCode=0 Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.779988 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.788255 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.803969 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.818886 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.823929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.823984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.823994 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 
11:12:47.824011 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.824022 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.835401 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.850181 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.871306 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.884333 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.897157 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adbc7948-b89f-46f1-8ebd-c5406fee4e30-host\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.897213 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-k8kzk\" (UniqueName: \"kubernetes.io/projected/adbc7948-b89f-46f1-8ebd-c5406fee4e30-kube-api-access-k8kzk\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.897242 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/adbc7948-b89f-46f1-8ebd-c5406fee4e30-serviceca\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.901210 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.918426 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.929913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.929974 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.929987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.930007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.930021 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:47Z","lastTransitionTime":"2026-02-26T11:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.934764 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.981415 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:47Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.998650 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adbc7948-b89f-46f1-8ebd-c5406fee4e30-host\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.998708 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8kzk\" (UniqueName: \"kubernetes.io/projected/adbc7948-b89f-46f1-8ebd-c5406fee4e30-kube-api-access-k8kzk\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.998735 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/adbc7948-b89f-46f1-8ebd-c5406fee4e30-serviceca\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:47 crc kubenswrapper[4699]: I0226 11:12:47.998759 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adbc7948-b89f-46f1-8ebd-c5406fee4e30-host\") pod \"node-ca-gs59q\" (UID: 
\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.000006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/adbc7948-b89f-46f1-8ebd-c5406fee4e30-serviceca\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.011924 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name
\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.028610 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8kzk\" (UniqueName: \"kubernetes.io/projected/adbc7948-b89f-46f1-8ebd-c5406fee4e30-kube-api-access-k8kzk\") pod \"node-ca-gs59q\" (UID: \"adbc7948-b89f-46f1-8ebd-c5406fee4e30\") " pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.032907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.032945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.032954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.032987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.032997 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.041704 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 
11:12:48.061304 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.069352 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gs59q" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.076503 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: W0226 11:12:48.081221 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbc7948_b89f_46f1_8ebd_c5406fee4e30.slice/crio-e19702bf616d777e4c3b196bddff9586305430656663d47d5076a6ca0becb46d WatchSource:0}: Error finding container e19702bf616d777e4c3b196bddff9586305430656663d47d5076a6ca0becb46d: Status 404 returned error can't find the container with id e19702bf616d777e4c3b196bddff9586305430656663d47d5076a6ca0becb46d Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.093713 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.115161 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.130754 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.134758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.134804 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.134813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.134830 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.134841 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.146649 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.165517 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.181304 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.199076 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.221740 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.238058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.238470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.238486 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 
11:12:48.238502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.238512 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.238408 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.254023 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.260547 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.260659 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:48 crc kubenswrapper[4699]: E0226 11:12:48.260820 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.260880 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:48 crc kubenswrapper[4699]: E0226 11:12:48.261033 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:48 crc kubenswrapper[4699]: E0226 11:12:48.260937 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.273496 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.285069 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.341402 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.341883 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.341964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 
11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.342040 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.342120 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.444689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.444766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.444779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.444794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.444909 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.548151 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.548212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.548224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.548245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.548259 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.650907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.650937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.650948 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.650963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.650975 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.754091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.754163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.754173 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.754187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.754197 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.784367 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gs59q" event={"ID":"adbc7948-b89f-46f1-8ebd-c5406fee4e30","Type":"ContainerStarted","Data":"e19702bf616d777e4c3b196bddff9586305430656663d47d5076a6ca0becb46d"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.788290 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerStarted","Data":"fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.805785 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.821107 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.841257 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.857550 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.857614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.857626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.857653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.857668 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.864687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.881777 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.898726 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tc
zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.917974 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.933697 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.957030 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.961785 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.961834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.961851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.961871 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.961885 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:48Z","lastTransitionTime":"2026-02-26T11:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.973631 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:48 crc kubenswrapper[4699]: I0226 11:12:48.990932 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:48Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.013903 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.030690 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.044280 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.064522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.064570 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.064588 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.064610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.064622 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.167023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.167067 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.167078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.167096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.167109 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.269547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.269578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.269586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.269601 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.269800 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.371890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.371927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.371944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.371965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.371977 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.474070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.474151 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.474161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.474180 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.474192 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.576800 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.576849 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.576860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.576874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.576884 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.679990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.680038 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.680047 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.680065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.680079 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.783315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.783369 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.783378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.783392 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.783403 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.792560 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gs59q" event={"ID":"adbc7948-b89f-46f1-8ebd-c5406fee4e30","Type":"ContainerStarted","Data":"6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.797437 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.798028 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.798084 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.798104 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.818227 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.832594 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.841727 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.848467 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.848993 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tc
zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.863949 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.876457 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.885417 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.885445 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.885453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.885467 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.885476 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.890658 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.906489 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.920480 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.935446 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.959975 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.972671 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.987714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.987766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.987776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.987791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.987801 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:49Z","lastTransitionTime":"2026-02-26T11:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:49 crc kubenswrapper[4699]: I0226 11:12:49.990557 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:49Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.007813 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.025068 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.042749 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.060570 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.087858 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.090622 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.090664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.090674 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.090690 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.090700 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.113516 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.127476 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.147047 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tc
zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.163797 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.178297 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.193357 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.193402 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.193414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.193441 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.193455 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.195141 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.216262 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.232468 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.245541 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.260136 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.260416 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:50 crc kubenswrapper[4699]: E0226 11:12:50.260548 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.260892 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:50 crc kubenswrapper[4699]: E0226 11:12:50.260958 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:50 crc kubenswrapper[4699]: E0226 11:12:50.261006 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.270677 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.282864 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:50Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.298646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.298725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.298749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.298775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 
11:12:50.298789 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.401473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.401514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.401527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.401545 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.401556 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.505233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.505292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.505303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.505319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.505330 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.607918 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.607951 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.607961 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.607995 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.608009 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.710606 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.710651 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.710664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.710679 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.710689 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.813736 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.813768 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.813776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.813791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.813800 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.915995 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.916038 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.916049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.916065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:50 crc kubenswrapper[4699]: I0226 11:12:50.916083 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:50Z","lastTransitionTime":"2026-02-26T11:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.039904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.039935 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.039944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.039958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.039967 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.142081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.142133 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.142144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.142179 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.142192 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.244834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.244869 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.244877 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.244894 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.244903 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.346884 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.346932 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.346945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.346962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.346978 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.448895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.448935 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.448947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.448963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.448974 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.550726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.550758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.550767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.550780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.550790 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.653092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.653150 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.653161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.653174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.653184 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.755575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.755623 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.755637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.755653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.755664 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.811876 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38" exitCode=0 Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.811973 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.827049 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.841366 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.854011 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.857673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.857717 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.857729 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.857745 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.857758 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.875002 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.887491 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.900666 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.915910 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.933211 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.955255 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249
0762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.964154 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.964241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.964258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.964279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.964292 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:51Z","lastTransitionTime":"2026-02-26T11:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:51 crc kubenswrapper[4699]: I0226 11:12:51.985750 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.001309 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:51Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.017956 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.030997 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c60
7682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.044021 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.066630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.066666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.066676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc 
kubenswrapper[4699]: I0226 11:12:52.066692 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.066711 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.149503 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.149759 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:24.149731824 +0000 UTC m=+149.960558258 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.169314 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.169574 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.169679 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.169786 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.169963 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.251014 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.251074 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.251097 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.251159 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251261 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251310 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:24.251297911 +0000 UTC m=+150.062124345 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251477 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251501 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:24.251494526 +0000 UTC m=+150.062320960 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251623 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251638 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251647 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251669 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:24.251663131 +0000 UTC m=+150.062489565 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251926 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251949 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251958 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.251987 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:24.251977389 +0000 UTC m=+150.062803823 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.260105 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.260111 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.260136 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.260329 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.260477 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:52 crc kubenswrapper[4699]: E0226 11:12:52.260681 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.272847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.272901 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.272915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.272936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.272949 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.375060 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.375107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.375143 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.375164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.375175 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.477653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.477698 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.477711 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.477727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.477740 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.579474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.579519 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.579533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.579549 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.579560 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.683297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.683341 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.683352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.683368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.683380 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.786254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.786298 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.786313 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.786331 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.786344 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.820261 4699 generic.go:334] "Generic (PLEG): container finished" podID="95d160b5-697e-42fa-8cd0-8b7b337820c4" containerID="df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c" exitCode=0 Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.820319 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerDied","Data":"df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.835667 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.853105 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.875070 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.890223 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.890286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.890301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.890323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.890336 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.899422 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.914039 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.933047 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.948455 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.962043 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.980625 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.993146 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:52Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.993493 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.993531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.993543 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 
11:12:52.993559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:52 crc kubenswrapper[4699]: I0226 11:12:52.993571 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:52Z","lastTransitionTime":"2026-02-26T11:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.010067 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.022919 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.043250 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.055508 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.095702 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.095742 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.095753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.095772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 
11:12:53.095788 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.198421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.198456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.198468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.198482 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.198493 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.260742 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.301493 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.301538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.301550 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.301596 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.301610 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.405795 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.405823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.405832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.405845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.405856 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.508401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.508433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.508442 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.508455 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.508466 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.610769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.610811 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.610822 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.610839 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.610849 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.711572 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4"] Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.712133 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.712942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.712970 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.712983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.713019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.713034 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.714056 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.714216 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.727695 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.739367 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.753953 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.768814 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.768861 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqfc\" (UniqueName: \"kubernetes.io/projected/6dd0f846-a702-4f37-a862-f620cb23e7bf-kube-api-access-rnqfc\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc 
kubenswrapper[4699]: I0226 11:12:53.768880 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.768905 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.776499 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.787889 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.802348 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206
b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.814190 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.815637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.815674 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.815686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.815704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.815718 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.824919 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.826421 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.826751 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.830847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" event={"ID":"95d160b5-697e-42fa-8cd0-8b7b337820c4","Type":"ContainerStarted","Data":"aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.831174 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.844728 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.855819 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.869745 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.869831 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.869913 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.870989 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.871388 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.871792 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6dd0f846-a702-4f37-a862-f620cb23e7bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.872444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqfc\" (UniqueName: \"kubernetes.io/projected/6dd0f846-a702-4f37-a862-f620cb23e7bf-kube-api-access-rnqfc\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.883147 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dd0f846-a702-4f37-a862-f620cb23e7bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.885520 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.889544 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqfc\" (UniqueName: \"kubernetes.io/projected/6dd0f846-a702-4f37-a862-f620cb23e7bf-kube-api-access-rnqfc\") pod \"ovnkube-control-plane-749d76644c-9nrn4\" (UID: \"6dd0f846-a702-4f37-a862-f620cb23e7bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.896398 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.913489 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.917731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.917780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.917791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.917832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.917857 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:53Z","lastTransitionTime":"2026-02-26T11:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.922798 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.935159 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be
5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.946335 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe
3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.958926 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.978414 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:53 crc kubenswrapper[4699]: I0226 11:12:53.990242 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:53Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.004388 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.018294 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.023056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.023092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.023101 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.023136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.023148 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.025722 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.036024 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.053456 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"app
rover\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc 
kubenswrapper[4699]: W0226 11:12:54.071482 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd0f846_a702_4f37_a862_f620cb23e7bf.slice/crio-c26b394f33ef5ce7f27295b839e9a69160522b06071e4f8a8bb4a77d3876bc05 WatchSource:0}: Error finding container c26b394f33ef5ce7f27295b839e9a69160522b06071e4f8a8bb4a77d3876bc05: Status 404 returned error can't find the container with id c26b394f33ef5ce7f27295b839e9a69160522b06071e4f8a8bb4a77d3876bc05 Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.072657 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.088416 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.104993 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.119840 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.125797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.125840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.125852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.125868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.125881 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.141920 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.155990 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.228093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.228141 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.228149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.228161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.228170 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.260180 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.260226 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.260294 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.260344 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.260444 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.260523 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.332315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.332352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.332362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.332375 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.332386 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.385861 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.385913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.385924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.385942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.385954 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.402057 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.407665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.407726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.407737 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.407756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.407770 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.422279 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.428440 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.428494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.428505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.428523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.428535 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.442878 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.449639 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.449696 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.449707 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.449726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.449741 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.457034 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-v5ctv"] Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.457769 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.457867 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.468265 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.473322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.473357 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.473365 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.473377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.473387 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.477018 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z 
is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.478428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.478540 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phthm\" (UniqueName: \"kubernetes.io/projected/6956c039-cf77-429b-8f7f-f93ba195d321-kube-api-access-phthm\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.491330 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.491683 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.491839 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.493528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.493572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.493596 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.493614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.493630 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.508634 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.525076 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.542160 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.556204 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.579628 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phthm\" (UniqueName: \"kubernetes.io/projected/6956c039-cf77-429b-8f7f-f93ba195d321-kube-api-access-phthm\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.579708 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.579837 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:54 crc kubenswrapper[4699]: E0226 11:12:54.579900 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs 
podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:55.079883816 +0000 UTC m=+120.890710250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.586339 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.595432 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.595465 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.595474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.595489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.595501 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.598922 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phthm\" (UniqueName: \"kubernetes.io/projected/6956c039-cf77-429b-8f7f-f93ba195d321-kube-api-access-phthm\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.601084 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.619358 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.646998 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.662695 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.675898 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc 
kubenswrapper[4699]: I0226 11:12:54.697938 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.698000 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.698012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.698049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.698060 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.701206 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.718289 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.742526 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.755885 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.801697 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.801747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.801760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.801781 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.801795 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.837975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" event={"ID":"6dd0f846-a702-4f37-a862-f620cb23e7bf","Type":"ContainerStarted","Data":"19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.838042 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" event={"ID":"6dd0f846-a702-4f37-a862-f620cb23e7bf","Type":"ContainerStarted","Data":"a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.838055 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" event={"ID":"6dd0f846-a702-4f37-a862-f620cb23e7bf","Type":"ContainerStarted","Data":"c26b394f33ef5ce7f27295b839e9a69160522b06071e4f8a8bb4a77d3876bc05"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.868459 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.907508 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.916718 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.916769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.916781 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.916799 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.916813 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:54Z","lastTransitionTime":"2026-02-26T11:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.929673 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.943479 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc 
kubenswrapper[4699]: I0226 11:12:54.971588 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:54 crc kubenswrapper[4699]: I0226 11:12:54.992071 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:54Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.011399 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.019872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.019936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.019947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.019961 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.019974 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.025677 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.040733 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.055205 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.068999 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.088658 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:55 crc kubenswrapper[4699]: E0226 11:12:55.088839 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.088999 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: E0226 11:12:55.088918 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:56.088900231 +0000 UTC m=+121.899726665 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.106049 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.124614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.124655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.124666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.124682 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.124694 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.126615 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.150108 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.167717 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.228487 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.228533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.228546 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.228563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 
11:12:55.228575 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.331099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.331150 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.331163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.331178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.331188 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.435605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.435678 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.435708 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.435734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.435752 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.539207 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.539246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.539255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.539267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.539276 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.641989 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.642050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.642065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.642084 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.642099 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.744398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.744433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.744444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.744458 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.744472 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.846686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.847339 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/0.log" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.847408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.847456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.847502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.847520 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.852287 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df" exitCode=1 Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.852393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.853369 4699 scope.go:117] "RemoveContainer" containerID="2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.868317 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.886076 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.901005 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc 
kubenswrapper[4699]: I0226 11:12:55.919017 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.936562 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.950798 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.950873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.950892 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.950915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.950932 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:55Z","lastTransitionTime":"2026-02-26T11:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.956666 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.969259 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"star
tTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:55 crc kubenswrapper[4699]: I0226 11:12:55.989746 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"
log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d4
80989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:55Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.004021 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.022721 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.037220 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.052274 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.053062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.053093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.053103 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.053135 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.053149 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:12:56Z","lastTransitionTime":"2026-02-26T11:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.077387 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:55Z\\\",\\\"message\\\":\\\"6 11:12:55.725598 6475 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 11:12:55.725608 6475 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 11:12:55.725632 6475 handler.go:208] Removed 
*v1.Node event handler 7\\\\nI0226 11:12:55.725634 6475 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 11:12:55.725643 6475 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 11:12:55.725661 6475 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 11:12:55.725662 6475 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 11:12:55.725670 6475 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 11:12:55.725679 6475 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 11:12:55.725699 6475 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 11:12:55.725719 6475 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 11:12:55.725726 6475 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 11:12:55.725742 6475 factory.go:656] Stopping watch factory\\\\nI0226 11:12:55.725744 6475 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 11:12:55.725764 6475 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:12:55.725766 6475 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698
cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.093476 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.100322 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.100709 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.100775 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:12:58.100758092 +0000 UTC m=+123.911584526 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.111053 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.129045 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.153452 4699 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.259806 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.259860 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.259946 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.259936 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.260008 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.260179 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.260285 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.260359 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.290298 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.306877 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.319667 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.331959 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc 
kubenswrapper[4699]: I0226 11:12:56.357365 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.372837 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.392683 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.406795 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.423745 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.440178 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pr
oxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.456499 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.478801 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.499167 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.516385 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: E0226 11:12:56.557913 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.559400 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:55Z\\\",\\\"message\\\":\\\"6 11:12:55.725598 6475 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 11:12:55.725608 6475 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 11:12:55.725632 6475 handler.go:208] Removed 
*v1.Node event handler 7\\\\nI0226 11:12:55.725634 6475 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 11:12:55.725643 6475 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 11:12:55.725661 6475 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 11:12:55.725662 6475 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 11:12:55.725670 6475 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 11:12:55.725679 6475 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 11:12:55.725699 6475 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 11:12:55.725719 6475 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 11:12:55.725726 6475 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 11:12:55.725742 6475 factory.go:656] Stopping watch factory\\\\nI0226 11:12:55.725744 6475 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 11:12:55.725764 6475 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:12:55.725766 6475 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698
cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.579642 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.860006 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/0.log" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.864318 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396"} Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.865023 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.895370 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.914003 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.933045 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.945963 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.960616 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.972010 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pr
oxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:56 crc kubenswrapper[4699]: I0226 11:12:56.987248 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:56Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.004456 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.017625 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.028656 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.047712 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:55Z\\\",\\\"message\\\":\\\"6 11:12:55.725598 6475 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 11:12:55.725608 6475 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 11:12:55.725632 6475 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 11:12:55.725634 6475 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0226 11:12:55.725643 6475 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 11:12:55.725661 6475 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 11:12:55.725662 6475 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 11:12:55.725670 6475 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 11:12:55.725679 6475 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 11:12:55.725699 6475 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 11:12:55.725719 6475 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 11:12:55.725726 6475 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 11:12:55.725742 6475 factory.go:656] Stopping watch factory\\\\nI0226 11:12:55.725744 6475 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 11:12:55.725764 6475 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:12:55.725766 6475 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.058657 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.076974 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.088363 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.100881 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.111768 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc 
kubenswrapper[4699]: I0226 11:12:57.869331 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/1.log" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.870509 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/0.log" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.873351 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396" exitCode=1 Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.873388 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396"} Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.873443 4699 scope.go:117] "RemoveContainer" containerID="2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.874035 4699 scope.go:117] "RemoveContainer" containerID="46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396" Feb 26 11:12:57 crc kubenswrapper[4699]: E0226 11:12:57.874224 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.889512 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.905067 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.918445 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc 
kubenswrapper[4699]: I0226 11:12:57.932056 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.953860 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.968298 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:57 crc kubenswrapper[4699]: I0226 11:12:57.987171 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:57.999928 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:57Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.014572 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26
T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.026485 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rba
c-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.039883 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.053787 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.064648 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.082664 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2415fafc678da73fd9c30a65db8d0b812a049df0797deb080a2118c22e4536df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:55Z\\\",\\\"message\\\":\\\"6 11:12:55.725598 6475 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 11:12:55.725608 6475 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 11:12:55.725632 6475 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 11:12:55.725634 6475 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0226 11:12:55.725643 6475 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 11:12:55.725661 6475 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 11:12:55.725662 6475 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 11:12:55.725670 6475 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 11:12:55.725679 6475 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 11:12:55.725699 6475 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 11:12:55.725719 6475 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 11:12:55.725726 6475 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 11:12:55.725742 6475 factory.go:656] Stopping watch factory\\\\nI0226 11:12:55.725744 6475 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 11:12:55.725764 6475 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:12:55.725766 6475 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnm
g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.094352 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.107234 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.122722 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.122910 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.122992 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:02.122972441 +0000 UTC m=+127.933798875 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.260280 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.260431 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.260458 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.260527 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.260615 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.260690 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.260916 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.261061 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.880006 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/1.log" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.886834 4699 scope.go:117] "RemoveContainer" containerID="46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396" Feb 26 11:12:58 crc kubenswrapper[4699]: E0226 11:12:58.887201 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.910949 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.924235 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.947366 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.959907 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.973689 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.985999 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pr
oxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:58 crc kubenswrapper[4699]: I0226 11:12:58.999824 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:58Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.013861 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.030989 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.043898 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.068738 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 
6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.084334 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.105177 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.118759 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.135953 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:12:59 crc kubenswrapper[4699]: I0226 11:12:59.147755 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:12:59Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:00 crc 
kubenswrapper[4699]: I0226 11:13:00.260177 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:00 crc kubenswrapper[4699]: I0226 11:13:00.260260 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:00 crc kubenswrapper[4699]: I0226 11:13:00.260272 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:00 crc kubenswrapper[4699]: E0226 11:13:00.260328 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:00 crc kubenswrapper[4699]: E0226 11:13:00.260481 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:00 crc kubenswrapper[4699]: I0226 11:13:00.260519 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:00 crc kubenswrapper[4699]: E0226 11:13:00.260803 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:00 crc kubenswrapper[4699]: E0226 11:13:00.260554 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:01 crc kubenswrapper[4699]: E0226 11:13:01.559577 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 11:13:02 crc kubenswrapper[4699]: I0226 11:13:02.171854 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.172043 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.172201 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:10.172181363 +0000 UTC m=+135.983007797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:02 crc kubenswrapper[4699]: I0226 11:13:02.260715 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:02 crc kubenswrapper[4699]: I0226 11:13:02.260750 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.260840 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:02 crc kubenswrapper[4699]: I0226 11:13:02.260715 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.260967 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.261029 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:02 crc kubenswrapper[4699]: I0226 11:13:02.261051 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:02 crc kubenswrapper[4699]: E0226 11:13:02.261234 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:03 crc kubenswrapper[4699]: I0226 11:13:03.268809 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.260414 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.260514 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.260535 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.260584 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.260659 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.260810 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.260997 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.261046 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.540388 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.540433 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.540446 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.540461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.540473 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:04Z","lastTransitionTime":"2026-02-26T11:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.553455 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:04Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.558086 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.558135 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.558146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.558163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.558177 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:04Z","lastTransitionTime":"2026-02-26T11:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.573324 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:04Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.576739 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.576776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.576787 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.576804 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.576816 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:04Z","lastTransitionTime":"2026-02-26T11:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.592387 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:04Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.595533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.595568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.595582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.595604 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.595615 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:04Z","lastTransitionTime":"2026-02-26T11:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.608842 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:04Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.612839 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.612902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.612915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.612934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:04 crc kubenswrapper[4699]: I0226 11:13:04.612948 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:04Z","lastTransitionTime":"2026-02-26T11:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.625635 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:04Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:04 crc kubenswrapper[4699]: E0226 11:13:04.625824 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.260459 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:06 crc kubenswrapper[4699]: E0226 11:13:06.260558 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.260725 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:06 crc kubenswrapper[4699]: E0226 11:13:06.260769 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.260872 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:06 crc kubenswrapper[4699]: E0226 11:13:06.260915 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.261056 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:06 crc kubenswrapper[4699]: E0226 11:13:06.261135 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.277947 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.290848 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.306290 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.321649 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.333053 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc 
kubenswrapper[4699]: I0226 11:13:06.352014 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.363201 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.376264 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.388064 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.400452 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.410582 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pr
oxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.421603 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.435280 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.449087 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.459636 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.479403 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 
6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: I0226 11:13:06.490472 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:06Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:06 crc kubenswrapper[4699]: E0226 11:13:06.560091 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.714804 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.746455 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 6769 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.760502 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.776778 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.791648 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.806946 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.821264 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.833330 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.848283 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.862231 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.896612 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.925424 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.946677 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e28
7507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.969960 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:07 crc kubenswrapper[4699]: I0226 11:13:07.986956 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:07Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.007931 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:08Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.033292 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:08Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.048639 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:08Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:08 crc kubenswrapper[4699]: 
I0226 11:13:08.259752 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.259781 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.259868 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:08 crc kubenswrapper[4699]: E0226 11:13:08.259936 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:08 crc kubenswrapper[4699]: I0226 11:13:08.260024 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:08 crc kubenswrapper[4699]: E0226 11:13:08.260085 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:08 crc kubenswrapper[4699]: E0226 11:13:08.260280 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:08 crc kubenswrapper[4699]: E0226 11:13:08.260374 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:10 crc kubenswrapper[4699]: I0226 11:13:10.260397 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:10 crc kubenswrapper[4699]: I0226 11:13:10.260434 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:10 crc kubenswrapper[4699]: I0226 11:13:10.260397 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.260554 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:10 crc kubenswrapper[4699]: I0226 11:13:10.260583 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.260664 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.260737 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.260838 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:10 crc kubenswrapper[4699]: I0226 11:13:10.263276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.263450 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:10 crc kubenswrapper[4699]: E0226 11:13:10.263571 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:26.263542202 +0000 UTC m=+152.074368826 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:11 crc kubenswrapper[4699]: E0226 11:13:11.561495 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:12 crc kubenswrapper[4699]: I0226 11:13:12.259749 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:12 crc kubenswrapper[4699]: I0226 11:13:12.259749 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:12 crc kubenswrapper[4699]: E0226 11:13:12.260216 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:12 crc kubenswrapper[4699]: I0226 11:13:12.259792 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:12 crc kubenswrapper[4699]: I0226 11:13:12.259749 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:12 crc kubenswrapper[4699]: E0226 11:13:12.260336 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:12 crc kubenswrapper[4699]: E0226 11:13:12.260401 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:12 crc kubenswrapper[4699]: E0226 11:13:12.260223 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:13 crc kubenswrapper[4699]: I0226 11:13:13.261098 4699 scope.go:117] "RemoveContainer" containerID="46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396" Feb 26 11:13:13 crc kubenswrapper[4699]: I0226 11:13:13.971287 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/1.log" Feb 26 11:13:13 crc kubenswrapper[4699]: I0226 11:13:13.974359 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0"} Feb 26 11:13:13 crc kubenswrapper[4699]: I0226 11:13:13.974784 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:13:13 crc kubenswrapper[4699]: I0226 11:13:13.988999 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:13Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.003789 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.018626 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.033556 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.044765 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.064463 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 
6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-
config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecf
g-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.080073 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.094975 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.107414 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.120334 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.137831 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.153348 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc 
kubenswrapper[4699]: I0226 11:13:14.175363 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363
066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 
11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.204351 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.222029 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.249259 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.259883 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.259907 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.259946 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.260031 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.260168 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.260162 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.260245 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.260365 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.265446 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e28
7507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.646063 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.646138 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.646152 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.646168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.646178 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:14Z","lastTransitionTime":"2026-02-26T11:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.658993 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.662571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.662628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.662641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.662659 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.662671 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:14Z","lastTransitionTime":"2026-02-26T11:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.674853 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.678456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.678489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.678499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.678513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.678522 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:14Z","lastTransitionTime":"2026-02-26T11:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.693933 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.697704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.697746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.697758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.697775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.697787 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:14Z","lastTransitionTime":"2026-02-26T11:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.709961 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.714518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.714562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.714572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.714592 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.714602 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:14Z","lastTransitionTime":"2026-02-26T11:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.726730 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:14Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.726891 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.979370 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/2.log" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.980456 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/1.log" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.983581 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" 
containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0" exitCode=1 Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.983680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0"} Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.983784 4699 scope.go:117] "RemoveContainer" containerID="46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396" Feb 26 11:13:14 crc kubenswrapper[4699]: I0226 11:13:14.984510 4699 scope.go:117] "RemoveContainer" containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0" Feb 26 11:13:14 crc kubenswrapper[4699]: E0226 11:13:14.984738 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.008696 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46aba02a00b28d54085033583368e3a91e4a1f042cab3b0abc3885f717555396\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:12:57Z\\\",\\\"message\\\":\\\"openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0226 11:12:56.988067 6769 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gbl2h in node crc\\\\nI0226 11:12:56.988312 6769 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-dns/node-resolver-gbl2h after 0 failed attempt(s)\\\\nI0226 11:12:56.988318 6769 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gbl2h\\\\nI0226 11:12:56.988176 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0226 11:12:56.988330 6769 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0226 11:12:56.988335 6769 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0226 11:12:56.988194 6769 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988346 6769 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0226 11:12:56.988353 6769 ovn.go:134] Ensuring zone local for Pod openshift-network-operato\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"ernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.559644 7027 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.559182 7027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560083 7027 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560811 7027 shared_informer.go:320] Caches are synced for 
node-tracker-controller\\\\nI0226 11:13:14.560836 7027 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0226 11:13:14.561218 7027 factory.go:656] Stopping watch factory\\\\nI0226 11:13:14.561367 7027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.561415 7027 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.562003 7027 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:13:14.562038 7027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 11:13:14.562134 7027 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.021274 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.037359 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.054862 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.070164 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.084442 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.098620 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.115434 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.131581 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.149081 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.162595 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.174752 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e28
7507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.194626 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.210828 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.226649 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.241693 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.256777 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:15Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:15 crc kubenswrapper[4699]: 
I0226 11:13:15.989501 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/2.log" Feb 26 11:13:15 crc kubenswrapper[4699]: I0226 11:13:15.993378 4699 scope.go:117] "RemoveContainer" containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0" Feb 26 11:13:15 crc kubenswrapper[4699]: E0226 11:13:15.993628 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.005863 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6
df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.018530 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.034732 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.048473 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.072250 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"ernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\
\\\nI0226 11:13:14.559644 7027 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.559182 7027 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560083 7027 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560811 7027 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0226 11:13:14.560836 7027 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0226 11:13:14.561218 7027 factory.go:656] Stopping watch factory\\\\nI0226 11:13:14.561367 7027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.561415 7027 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.562003 7027 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:13:14.562038 7027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 11:13:14.562134 7027 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:13:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.084042 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.098664 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.110706 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.121415 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.135525 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.149159 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc 
kubenswrapper[4699]: I0226 11:13:16.169136 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363
066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 
11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.184508 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.198193 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.214584 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.229893 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.252996 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a0
7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.260294 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.260357 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:16 crc kubenswrapper[4699]: E0226 11:13:16.260555 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.260679 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.260691 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:16 crc kubenswrapper[4699]: E0226 11:13:16.260820 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:16 crc kubenswrapper[4699]: E0226 11:13:16.260947 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:16 crc kubenswrapper[4699]: E0226 11:13:16.261100 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.284968 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.299989 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.317529 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.329824 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.345575 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26
T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.357978 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rba
c-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.368298 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.379874 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.392522 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.406837 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.418245 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.438331 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"ernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.559644 7027 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.559182 7027 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560083 7027 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560811 7027 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0226 11:13:14.560836 7027 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0226 11:13:14.561218 7027 factory.go:656] Stopping watch factory\\\\nI0226 11:13:14.561367 7027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.561415 7027 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.562003 7027 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:13:14.562038 7027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 11:13:14.562134 7027 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:13:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.453971 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b
808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.466810 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.479381 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.495579 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc kubenswrapper[4699]: I0226 11:13:16.507777 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:16Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:16 crc 
kubenswrapper[4699]: E0226 11:13:16.561894 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:18 crc kubenswrapper[4699]: I0226 11:13:18.259790 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:18 crc kubenswrapper[4699]: I0226 11:13:18.259877 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:18 crc kubenswrapper[4699]: I0226 11:13:18.259931 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:18 crc kubenswrapper[4699]: E0226 11:13:18.260009 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:18 crc kubenswrapper[4699]: I0226 11:13:18.260067 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:18 crc kubenswrapper[4699]: E0226 11:13:18.260150 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:18 crc kubenswrapper[4699]: E0226 11:13:18.260195 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:18 crc kubenswrapper[4699]: E0226 11:13:18.260257 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:20 crc kubenswrapper[4699]: I0226 11:13:20.259919 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:20 crc kubenswrapper[4699]: I0226 11:13:20.259948 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:20 crc kubenswrapper[4699]: I0226 11:13:20.260434 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:20 crc kubenswrapper[4699]: E0226 11:13:20.260618 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:20 crc kubenswrapper[4699]: I0226 11:13:20.260705 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:20 crc kubenswrapper[4699]: E0226 11:13:20.260807 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:20 crc kubenswrapper[4699]: E0226 11:13:20.260876 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:20 crc kubenswrapper[4699]: E0226 11:13:20.260988 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:20 crc kubenswrapper[4699]: I0226 11:13:20.274434 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 11:13:21 crc kubenswrapper[4699]: E0226 11:13:21.563050 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:22 crc kubenswrapper[4699]: I0226 11:13:22.260011 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:22 crc kubenswrapper[4699]: I0226 11:13:22.260094 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:22 crc kubenswrapper[4699]: I0226 11:13:22.260050 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:22 crc kubenswrapper[4699]: I0226 11:13:22.260182 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:22 crc kubenswrapper[4699]: E0226 11:13:22.260202 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:22 crc kubenswrapper[4699]: E0226 11:13:22.260300 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:22 crc kubenswrapper[4699]: E0226 11:13:22.260393 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:22 crc kubenswrapper[4699]: E0226 11:13:22.260504 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.218909 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.219176 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:28.219108293 +0000 UTC m=+214.029934777 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.260729 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.260776 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.260872 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.260941 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.260871 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.261017 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.261178 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.261240 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.319789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.319850 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.319883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.319903 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.319910 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.319958 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.319982 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:14:28.319964436 +0000 UTC m=+214.130790870 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320002 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:14:28.319993097 +0000 UTC m=+214.130819531 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320025 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320048 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320062 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320104 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:14:28.320093991 +0000 UTC m=+214.130920425 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320223 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320238 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320247 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.320281 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:14:28.320274397 +0000 UTC m=+214.131100831 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.768832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.768857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.768865 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.768878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.768887 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:24Z","lastTransitionTime":"2026-02-26T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.780638 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.783892 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.783920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.783930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.783944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.783954 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:24Z","lastTransitionTime":"2026-02-26T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.795532 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.799777 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.799812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.799822 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.799838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.799851 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:24Z","lastTransitionTime":"2026-02-26T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.810964 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.818360 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.818430 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.818506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.818531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.818548 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:24Z","lastTransitionTime":"2026-02-26T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.832423 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.836269 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.836319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.836328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.836342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:24 crc kubenswrapper[4699]: I0226 11:13:24.836353 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:24Z","lastTransitionTime":"2026-02-26T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.847750 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f74e239-3726-44b7-b791-47b33a2699be\\\",\\\"systemUUID\\\":\\\"e4404db5-04f3-42e6-90eb-21c35124a700\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:24Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:24 crc kubenswrapper[4699]: E0226 11:13:24.847861 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.260198 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.260339 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.260410 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.260541 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.260544 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.260564 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.260820 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.260839 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.276299 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee33485d-044d-4356-a626-df5e4625a4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:13:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:12:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 11:12:07.177508 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 11:12:07.178221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 11:12:07.179410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427621089/tls.crt::/tmp/serving-cert-3427621089/tls.key\\\\\\\"\\\\nI0226 11:12:07.693355 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 11:12:07.699737 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 11:12:07.699770 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 11:12:07.699794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 11:12:07.699801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 11:12:07.704076 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 11:12:07.704163 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 11:12:07.704178 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 11:12:07.704182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 
11:12:07.704187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 11:12:07.704192 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 11:12:07.704180 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 11:12:07.707201 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.293511 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ad0a6b-b6ae-493c-9d3c-22b171765374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e3cbce38225d1f5dc3e506a9bc813f8b1e76e2f3315a83ce984c0b51ad0b22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a914b4d63550a877b25fc802fe067357dc144ba29c8d4cff98b2d3d1c10baa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0c4b60e5b4a7561832379bb034e4202ac67058677a2e7e362703e8f5952b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://10b818c7e0d202c03b73fbaacb76a36fdf224d3674ac77a316eb94ce14a45836\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.309442 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.323641 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6005eb1355f9bc1dc9d5c6b4aa90bb33d74c16bd28f074c3266aea1b315fb1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6d74fea843381a23d5cabacf31e931a0dd98b4a11abcc02d3df951692965534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.338060 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6956c039-cf77-429b-8f7f-f93ba195d321\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phthm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v5ctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc 
kubenswrapper[4699]: I0226 11:13:26.339552 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.339702 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.339756 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs podName:6956c039-cf77-429b-8f7f-f93ba195d321 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.339743394 +0000 UTC m=+184.150569828 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs") pod "network-metrics-daemon-v5ctv" (UID: "6956c039-cf77-429b-8f7f-f93ba195d321") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.359642 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb09c7e7-a19a-4c5e-ace4-584f63f71fe8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52919194c48ba8d2d5312f534646aa43d41aba48324f4aca79e502ae157df536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0dff35454006f3c8277e6b0f96fb676434096cede813fbd68783b3982e4a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb55f15e1f01b945bdfef17406997d2a9d7835fd4dddd59e5a6661b180f3ca22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0dfe169d520e86bf6d1f1712a91c38a5b4d90661868f45c203e8b3db917ebf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3059e4195de5c0d992c4854fd6f8f83e494688b7c11b5b29835bbda22f4050b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef5999e9d480989b990803f3134d541b78f47eb97d312682d198507f853a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T11:11:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80a98e152c962355bb43eb7cd21f431853a8b19bb9e1cac28fe48a2bc86bd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a604176804dbcfad4c0f5cf5945c679818a0453a047091115fa9f8a8e573acb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.371864 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a028f97a63d8d7fa34c1029fb62aea26be554f157d879fdcf7553807a75d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.386371 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d160b5-697e-42fa-8cd0-8b7b337820c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff9ad002d597d66f0897adccd2319697d917dad946cc1e3fbb6b419397032e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc8c2167e8d3e5078f58aa53ee8861687c577adbbc475e0fc2e340abf6afcc01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83232777cdd898271757bffefc3c36b87e28390feea956025cc67851b2e4ea61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677566e01c477fea7d7ca41804df01d70b206b79f0acacc945f3dad5dc76bb3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2ffc
ec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2ffcec6c52eae876bc972074bff0b05a6d42620a9c6f3f4b97e2f4ff904e3e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd4d3547a1ae5e11f679ad08111ac12370edea14dddb45b168ca4b26295f3a38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39bbf7f7afc2aafedfa54a04af62da4a9d363986585fd2eac61f4a1d4ed34c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tczc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfp9h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.398788 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dd0f846-a702-4f37-a862-f620cb23e7bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1982fcdf20248152190117fa9699973c4f989d3a197f947e71f8066f8f16218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19dd2547b7a142b99f0170df619d1b1750e287507d7f8b8bad3299a4a3d23849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnqfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9nrn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.412239 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9106aba-3c7b-4722-a051-a7fe53d9b619\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:11:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:10:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55229c06747f2b5d388af00f4d2aa770f2786ea7f8015579fb05381eee44235f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4003edefbd2aac9a706e0d56e2791c34c4bc9a820e5cda0ab4cf3172fc4f5c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T11:11:32Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 11:11:02.135153 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 11:11:02.147322 1 observer_polling.go:159] Starting file observer\\\\nI0226 11:11:02.237317 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 11:11:02.242318 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 11:11:32.466165 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bc27153e659e049d639cf7b8963c1485433aed35f5efe5e88f1cc275d92a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1cda06107373ef4a7be9d68d9a39ed9f7351913e1deb1bd9e7d825d93ee54a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:10:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.432389 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2k6b7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32ce77d1-5287-4674-aeda-810070efbb29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2k6b7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.446802 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8a5817d51e63aa7fe3012de90c758002d8db1d9d829509434f8925c373f0b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bdaf031a209
43a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25g9f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-28p79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.458965 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gs59q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adbc7948-b89f-46f1-8ebd-c5406fee4e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b10028b125687697ebd614eb89d3715bc4913fd8dabed0a60e56572d469f5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8kzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gs59q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.474382 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.490716 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.506969 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250276015ba7d961b83bd0201443683d6f3558be50759ff714cff05dd2cc6393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.519567 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gbl2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db105b7b-9325-4f20-a760-06c045ea844f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed93026c595f2e3015e02073b545216e79ac8e68085a3b2ccead1a25e6c0afce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9n6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gbl2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: I0226 11:13:26.543202 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T11:12:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T11:13:14Z\\\",\\\"message\\\":\\\"ernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.559644 7027 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.559182 7027 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560083 7027 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.560811 7027 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0226 11:13:14.560836 7027 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0226 11:13:14.561218 7027 factory.go:656] Stopping watch factory\\\\nI0226 11:13:14.561367 7027 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 11:13:14.561415 7027 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 11:13:14.562003 7027 ovnkube.go:599] Stopped ovnkube\\\\nI0226 11:13:14.562038 7027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 11:13:14.562134 7027 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T11:13:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T11:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0570cc3dc8f007e88
024df69cf743715a318df16fb782800259eccc698cb124\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T11:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T11:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnmg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T11:12:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cw6vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T11:13:26Z is after 2025-08-24T17:21:41Z" Feb 26 11:13:26 crc kubenswrapper[4699]: E0226 11:13:26.563568 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.261420 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.261413 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.261500 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.261508 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:28 crc kubenswrapper[4699]: E0226 11:13:28.261596 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:28 crc kubenswrapper[4699]: E0226 11:13:28.261718 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:28 crc kubenswrapper[4699]: E0226 11:13:28.262218 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:28 crc kubenswrapper[4699]: E0226 11:13:28.262282 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.262421 4699 scope.go:117] "RemoveContainer" containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0" Feb 26 11:13:28 crc kubenswrapper[4699]: E0226 11:13:28.262665 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cw6vx_openshift-ovn-kubernetes(cd12b2df-7af6-45bc-88e7-d5e5e6451e65)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" Feb 26 11:13:28 crc kubenswrapper[4699]: I0226 11:13:28.274252 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 11:13:30 crc kubenswrapper[4699]: I0226 11:13:30.260348 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:30 crc kubenswrapper[4699]: I0226 11:13:30.260430 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:30 crc kubenswrapper[4699]: I0226 11:13:30.260500 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:30 crc kubenswrapper[4699]: E0226 11:13:30.260503 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:30 crc kubenswrapper[4699]: E0226 11:13:30.260578 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:30 crc kubenswrapper[4699]: I0226 11:13:30.260610 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:30 crc kubenswrapper[4699]: E0226 11:13:30.260662 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:30 crc kubenswrapper[4699]: E0226 11:13:30.260724 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:31 crc kubenswrapper[4699]: E0226 11:13:31.565250 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.051743 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2k6b7_32ce77d1-5287-4674-aeda-810070efbb29/kube-multus/0.log" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.051796 4699 generic.go:334] "Generic (PLEG): container finished" podID="32ce77d1-5287-4674-aeda-810070efbb29" containerID="b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3" exitCode=1 Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.051835 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerDied","Data":"b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3"} Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.052268 4699 scope.go:117] "RemoveContainer" containerID="b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.156435 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-gbl2h" podStartSLOduration=106.156419091 podStartE2EDuration="1m46.156419091s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.130720903 +0000 UTC m=+157.941547347" watchObservedRunningTime="2026-02-26 11:13:32.156419091 +0000 UTC m=+157.967245525" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.170016 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gs59q" podStartSLOduration=106.169996966 podStartE2EDuration="1m46.169996966s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.169779508 +0000 UTC m=+157.980605952" watchObservedRunningTime="2026-02-26 11:13:32.169996966 +0000 UTC m=+157.980823400" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.209496 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.209434765 podStartE2EDuration="29.209434765s" podCreationTimestamp="2026-02-26 11:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.208218062 +0000 UTC m=+158.019044506" watchObservedRunningTime="2026-02-26 11:13:32.209434765 +0000 UTC m=+158.020261189" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.211171 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.211154217 podStartE2EDuration="1m8.211154217s" podCreationTimestamp="2026-02-26 11:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-26 11:13:32.190201658 +0000 UTC m=+158.001028112" watchObservedRunningTime="2026-02-26 11:13:32.211154217 +0000 UTC m=+158.021980671" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.261236 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.261311 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.261319 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.261236 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:32 crc kubenswrapper[4699]: E0226 11:13:32.261425 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:32 crc kubenswrapper[4699]: E0226 11:13:32.261614 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:32 crc kubenswrapper[4699]: E0226 11:13:32.261633 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:32 crc kubenswrapper[4699]: E0226 11:13:32.261754 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.295242 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.295223131 podStartE2EDuration="4.295223131s" podCreationTimestamp="2026-02-26 11:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.294476734 +0000 UTC m=+158.105303168" watchObservedRunningTime="2026-02-26 11:13:32.295223131 +0000 UTC m=+158.106049585" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.334603 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=47.334581417 podStartE2EDuration="47.334581417s" podCreationTimestamp="2026-02-26 11:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.319717336 +0000 UTC m=+158.130543770" watchObservedRunningTime="2026-02-26 11:13:32.334581417 +0000 UTC m=+158.145407851" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.354337 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tfp9h" podStartSLOduration=106.354314832 podStartE2EDuration="1m46.354314832s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.353614787 +0000 UTC m=+158.164441241" watchObservedRunningTime="2026-02-26 11:13:32.354314832 +0000 UTC m=+158.165141276" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.390094 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=12.39007033 podStartE2EDuration="12.39007033s" podCreationTimestamp="2026-02-26 11:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.388980891 +0000 UTC m=+158.199807335" watchObservedRunningTime="2026-02-26 11:13:32.39007033 +0000 UTC m=+158.200896764" Feb 26 11:13:32 crc kubenswrapper[4699]: I0226 11:13:32.390465 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9nrn4" podStartSLOduration=105.390457533 podStartE2EDuration="1m45.390457533s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.370274522 +0000 UTC m=+158.181100976" watchObservedRunningTime="2026-02-26 11:13:32.390457533 +0000 UTC m=+158.201283967" Feb 26 
11:13:33 crc kubenswrapper[4699]: I0226 11:13:33.060317 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2k6b7_32ce77d1-5287-4674-aeda-810070efbb29/kube-multus/0.log" Feb 26 11:13:33 crc kubenswrapper[4699]: I0226 11:13:33.060373 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerStarted","Data":"143a97abf6e80c5d27a74181526e16c9b98e3306181c3568beb75b7c14de4b31"} Feb 26 11:13:33 crc kubenswrapper[4699]: I0226 11:13:33.075938 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podStartSLOduration=107.075921846 podStartE2EDuration="1m47.075921846s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:32.423325468 +0000 UTC m=+158.234151902" watchObservedRunningTime="2026-02-26 11:13:33.075921846 +0000 UTC m=+158.886748290" Feb 26 11:13:33 crc kubenswrapper[4699]: I0226 11:13:33.076156 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2k6b7" podStartSLOduration=107.076151555 podStartE2EDuration="1m47.076151555s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:33.075565594 +0000 UTC m=+158.886392048" watchObservedRunningTime="2026-02-26 11:13:33.076151555 +0000 UTC m=+158.886977979" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.260026 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.260087 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.260087 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.260237 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:34 crc kubenswrapper[4699]: E0226 11:13:34.260227 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:34 crc kubenswrapper[4699]: E0226 11:13:34.260363 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:34 crc kubenswrapper[4699]: E0226 11:13:34.260422 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:34 crc kubenswrapper[4699]: E0226 11:13:34.260487 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.886524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.886560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.886568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.886581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.886589 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T11:13:34Z","lastTransitionTime":"2026-02-26T11:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.928158 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47"] Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.929056 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.931658 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.932046 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.932277 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.932596 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.933378 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80a20711-23cf-449e-891a-acba8d452c48-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.933484 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: 
\"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.933557 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a20711-23cf-449e-891a-acba8d452c48-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.933604 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80a20711-23cf-449e-891a-acba8d452c48-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:34 crc kubenswrapper[4699]: I0226 11:13:34.933695 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034508 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80a20711-23cf-449e-891a-acba8d452c48-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034579 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80a20711-23cf-449e-891a-acba8d452c48-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034624 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034651 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a20711-23cf-449e-891a-acba8d452c48-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: 
\"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.034758 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/80a20711-23cf-449e-891a-acba8d452c48-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.035507 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80a20711-23cf-449e-891a-acba8d452c48-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.040897 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a20711-23cf-449e-891a-acba8d452c48-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.054512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80a20711-23cf-449e-891a-acba8d452c48-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bsj47\" (UID: \"80a20711-23cf-449e-891a-acba8d452c48\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.221732 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 
11:13:35.231870 4699 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 11:13:35 crc kubenswrapper[4699]: I0226 11:13:35.241736 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.069307 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" event={"ID":"80a20711-23cf-449e-891a-acba8d452c48","Type":"ContainerStarted","Data":"611ddfebed22732aaf5520081cd27230ed43d015e2fe4c756eb480bed82899bb"} Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.069354 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" event={"ID":"80a20711-23cf-449e-891a-acba8d452c48","Type":"ContainerStarted","Data":"2cc628b383fe90399d2ba5d6f403315a5522ed1fd1b3524410f0cb62790404ae"} Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.083920 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bsj47" podStartSLOduration=110.083900154 podStartE2EDuration="1m50.083900154s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:36.082973231 +0000 UTC m=+161.893799665" watchObservedRunningTime="2026-02-26 11:13:36.083900154 +0000 UTC m=+161.894726588" Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.259718 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.262506 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.262534 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:36 crc kubenswrapper[4699]: E0226 11:13:36.262511 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:36 crc kubenswrapper[4699]: I0226 11:13:36.262564 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:36 crc kubenswrapper[4699]: E0226 11:13:36.262640 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:36 crc kubenswrapper[4699]: E0226 11:13:36.262702 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:36 crc kubenswrapper[4699]: E0226 11:13:36.262792 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:36 crc kubenswrapper[4699]: E0226 11:13:36.565792 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 11:13:38 crc kubenswrapper[4699]: I0226 11:13:38.259667 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:38 crc kubenswrapper[4699]: I0226 11:13:38.259730 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:38 crc kubenswrapper[4699]: E0226 11:13:38.259807 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:13:38 crc kubenswrapper[4699]: I0226 11:13:38.259839 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:38 crc kubenswrapper[4699]: I0226 11:13:38.259858 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:38 crc kubenswrapper[4699]: E0226 11:13:38.259935 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:38 crc kubenswrapper[4699]: E0226 11:13:38.260130 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:13:38 crc kubenswrapper[4699]: E0226 11:13:38.260357 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:40 crc kubenswrapper[4699]: I0226 11:13:40.260033 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:40 crc kubenswrapper[4699]: I0226 11:13:40.260079 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:13:40 crc kubenswrapper[4699]: I0226 11:13:40.260047 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:13:40 crc kubenswrapper[4699]: E0226 11:13:40.260198 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321" Feb 26 11:13:40 crc kubenswrapper[4699]: I0226 11:13:40.260259 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:13:40 crc kubenswrapper[4699]: E0226 11:13:40.260334 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:13:40 crc kubenswrapper[4699]: E0226 11:13:40.260431 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:40 crc kubenswrapper[4699]: E0226 11:13:40.260493 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:41 crc kubenswrapper[4699]: E0226 11:13:41.567330 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 11:13:42 crc kubenswrapper[4699]: I0226 11:13:42.260699 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:42 crc kubenswrapper[4699]: I0226 11:13:42.260759 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:42 crc kubenswrapper[4699]: I0226 11:13:42.260767 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:42 crc kubenswrapper[4699]: I0226 11:13:42.260713 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:42 crc kubenswrapper[4699]: E0226 11:13:42.260841 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:42 crc kubenswrapper[4699]: E0226 11:13:42.260936 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 11:13:42 crc kubenswrapper[4699]: E0226 11:13:42.261027 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:42 crc kubenswrapper[4699]: E0226 11:13:42.261083 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:43 crc kubenswrapper[4699]: I0226 11:13:43.260393 4699 scope.go:117] "RemoveContainer" containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0"
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.096417 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/2.log"
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.100260 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerStarted","Data":"674b1ddc9ce52057921afe22948e78b0ac743b734851b7422144e06a6bedf770"}
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.260360 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.260410 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:44 crc kubenswrapper[4699]: E0226 11:13:44.260532 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.260378 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:44 crc kubenswrapper[4699]: I0226 11:13:44.260603 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:44 crc kubenswrapper[4699]: E0226 11:13:44.260745 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:44 crc kubenswrapper[4699]: E0226 11:13:44.260782 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:44 crc kubenswrapper[4699]: E0226 11:13:44.260840 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:45 crc kubenswrapper[4699]: I0226 11:13:45.103996 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx"
Feb 26 11:13:45 crc kubenswrapper[4699]: I0226 11:13:45.133508 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podStartSLOduration=119.133489472 podStartE2EDuration="1m59.133489472s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:13:45.132405374 +0000 UTC m=+170.943231828" watchObservedRunningTime="2026-02-26 11:13:45.133489472 +0000 UTC m=+170.944315906"
Feb 26 11:13:45 crc kubenswrapper[4699]: I0226 11:13:45.386171 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v5ctv"]
Feb 26 11:13:45 crc kubenswrapper[4699]: I0226 11:13:45.386308 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:45 crc kubenswrapper[4699]: E0226 11:13:45.386406 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:46 crc kubenswrapper[4699]: I0226 11:13:46.260415 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:46 crc kubenswrapper[4699]: I0226 11:13:46.260614 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:46 crc kubenswrapper[4699]: I0226 11:13:46.260614 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:46 crc kubenswrapper[4699]: E0226 11:13:46.261718 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:46 crc kubenswrapper[4699]: E0226 11:13:46.261811 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 11:13:46 crc kubenswrapper[4699]: E0226 11:13:46.261954 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:46 crc kubenswrapper[4699]: E0226 11:13:46.567866 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 11:13:47 crc kubenswrapper[4699]: I0226 11:13:47.259825 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:47 crc kubenswrapper[4699]: E0226 11:13:47.260029 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:48 crc kubenswrapper[4699]: I0226 11:13:48.260099 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:48 crc kubenswrapper[4699]: I0226 11:13:48.260151 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:48 crc kubenswrapper[4699]: I0226 11:13:48.260185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:48 crc kubenswrapper[4699]: E0226 11:13:48.260283 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:48 crc kubenswrapper[4699]: E0226 11:13:48.260348 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 11:13:48 crc kubenswrapper[4699]: E0226 11:13:48.260423 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:49 crc kubenswrapper[4699]: I0226 11:13:49.260294 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:49 crc kubenswrapper[4699]: E0226 11:13:49.260428 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:50 crc kubenswrapper[4699]: I0226 11:13:50.260176 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:50 crc kubenswrapper[4699]: E0226 11:13:50.260324 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 11:13:50 crc kubenswrapper[4699]: I0226 11:13:50.260176 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:50 crc kubenswrapper[4699]: I0226 11:13:50.260386 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:50 crc kubenswrapper[4699]: E0226 11:13:50.260451 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 11:13:50 crc kubenswrapper[4699]: E0226 11:13:50.260534 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 11:13:51 crc kubenswrapper[4699]: I0226 11:13:51.260517 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:51 crc kubenswrapper[4699]: E0226 11:13:51.260737 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v5ctv" podUID="6956c039-cf77-429b-8f7f-f93ba195d321"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.259756 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.259846 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.260040 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.264504 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.264827 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.265276 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 11:13:52 crc kubenswrapper[4699]: I0226 11:13:52.265475 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 11:13:53 crc kubenswrapper[4699]: I0226 11:13:53.260321 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv"
Feb 26 11:13:53 crc kubenswrapper[4699]: I0226 11:13:53.262386 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 11:13:53 crc kubenswrapper[4699]: I0226 11:13:53.262696 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 26 11:13:54 crc kubenswrapper[4699]: I0226 11:13:54.981925 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.024204 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.024597 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qsj62"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.024764 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.025032 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.025737 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.026044 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.029295 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.029699 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8s5j"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.029903 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.030255 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.030925 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.031463 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.031846 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.032349 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.035532 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.036093 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.036359 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.036743 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.036779 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.038900 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.039355 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.039638 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.039668 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.040518 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tcnxt"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.040952 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tcnxt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.049657 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.050744 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.051058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.051481 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.051750 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.051897 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.051755 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.052144 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.053905 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw64v"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.054759 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xbpcs"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.055715 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.057699 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.065548 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.065808 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.066280 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.066301 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.066378 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.072740 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.072944 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.074014 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.084331 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.101586 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.118234 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.119627 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.119831 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.120162 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.120303 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.120819 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.121584 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.121733 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.121831 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.121869 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.121963 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122166 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122211 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122250 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122302 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122325 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122362 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122421 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122429 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122453 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122528 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122540 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122595 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122641 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122691 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122645 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122735 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122800 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122843 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122805 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122883 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122757 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.122844 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.123006 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.123043 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.123063 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.123225 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.123402 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.132160 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.132523 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.132654 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.132869 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.135781 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.138882 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.139026 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.139430 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.139511 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.139793 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.140835 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.141092 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.141411 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.144077 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.144231 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.144442 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.144601 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.146308 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149248 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b070a40-85a6-42e6-a1bd-d834170a9c9c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149283 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149305 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5hb\" (UniqueName: \"kubernetes.io/projected/afa5e1ce-a457-4771-ab06-2654a7801704-kube-api-access-7h5hb\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149324 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvd8\" (UniqueName: \"kubernetes.io/projected/9c1f6032-b723-4cb3-a93b-73d053eaf822-kube-api-access-vcvd8\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149372 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit-dir\") pod
\"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149386 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7knj\" (UniqueName: \"kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvdh\" (UniqueName: \"kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149425 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-client\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149438 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-image-import-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149454 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a1581f-5367-4535-99bc-3f28547ab766-metrics-tls\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149473 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1f6032-b723-4cb3-a93b-73d053eaf822-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149487 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149501 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149516 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-images\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149530 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149564 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149579 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b070a40-85a6-42e6-a1bd-d834170a9c9c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149591 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-serving-cert\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149607 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149621 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-client\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149637 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149651 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-encryption-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149667 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-serving-cert\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149681 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149695 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64bd7009-a06a-43e1-b265-3ea78b5801b9-serving-cert\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149710 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149729 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: 
\"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149749 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-config\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149768 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149796 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149814 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjjgq\" (UniqueName: 
\"kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03fd3407-9529-4638-89d6-cfc6b703e510-serving-cert\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149860 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-audit-policies\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149883 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149898 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwpnc\" (UniqueName: \"kubernetes.io/projected/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-kube-api-access-xwpnc\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 
11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149916 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149949 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149966 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.149990 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-service-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150027 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1f6032-b723-4cb3-a93b-73d053eaf822-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150042 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150057 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jdw\" (UniqueName: \"kubernetes.io/projected/03fd3407-9529-4638-89d6-cfc6b703e510-kube-api-access-j4jdw\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150074 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150089 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150128 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttxr\" (UniqueName: \"kubernetes.io/projected/61a1581f-5367-4535-99bc-3f28547ab766-kube-api-access-2ttxr\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150159 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.150175 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/03fd3407-9529-4638-89d6-cfc6b703e510-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153824 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153862 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-node-pullsecrets\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153880 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-config\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153896 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: 
\"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153911 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153927 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qjv\" (UniqueName: \"kubernetes.io/projected/5d015dd8-56c9-4f61-b133-4951cda91ca5-kube-api-access-j4qjv\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153942 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153956 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmzs7\" (UniqueName: \"kubernetes.io/projected/72b1bc55-f48b-4d90-ab02-3a80438096b6-kube-api-access-rmzs7\") pod \"downloads-7954f5f757-tcnxt\" (UID: \"72b1bc55-f48b-4d90-ab02-3a80438096b6\") " pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153972 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94s2\" (UniqueName: \"kubernetes.io/projected/64bd7009-a06a-43e1-b265-3ea78b5801b9-kube-api-access-f94s2\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153985 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154002 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154018 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afa5e1ce-a457-4771-ab06-2654a7801704-audit-dir\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154034 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66m62\" (UniqueName: \"kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62\") pod 
\"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154049 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d015dd8-56c9-4f61-b133-4951cda91ca5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154064 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.154102 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-encryption-config\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: 
I0226 11:13:55.154558 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b070a40-85a6-42e6-a1bd-d834170a9c9c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.152756 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p742p"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.153774 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.155401 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j6vfb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.155770 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.155987 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.156306 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.156331 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.157171 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.157428 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.158239 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.158454 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.158575 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.158676 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.159290 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.159666 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.160799 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.161046 4699 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.161232 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.171990 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xm88w"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.172600 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.173278 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.180162 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.190109 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.191172 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.209142 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.209402 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.209630 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.209832 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.210185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.210884 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.211438 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.211927 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.211980 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.212519 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.212702 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.212788 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.213091 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.213305 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.213532 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.213873 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.215725 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.215923 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.216659 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217072 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217217 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hzqgp"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217380 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217860 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217874 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.217967 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.218167 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.218187 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.218552 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.218657 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.218700 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.219827 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.220200 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.221007 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.221353 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.221790 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.222777 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.223456 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.224344 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zqgj9"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.224798 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.225472 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.226193 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.229201 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.229221 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.230047 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.230501 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.230568 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.230602 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.230998 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.233594 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.246564 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.247336 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.247567 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.248650 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.249834 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.250712 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.250818 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.251250 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.251291 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.251306 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tnwpn"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.251722 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255245 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgnh6\" (UniqueName: \"kubernetes.io/projected/34163385-0c26-4d54-a06a-11f9ef09901d-kube-api-access-qgnh6\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: \"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255285 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-serving-cert\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255311 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8s5j\" (UID: 
\"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255329 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64bd7009-a06a-43e1-b265-3ea78b5801b9-serving-cert\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255364 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255379 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-config\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255393 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255409 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8m6h\" (UniqueName: \"kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a97e310-1811-48a9-a31a-eb9a0321d280-service-ca-bundle\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255443 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255471 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255490 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjjgq\" (UniqueName: \"kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255507 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03fd3407-9529-4638-89d6-cfc6b703e510-serving-cert\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255521 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-audit-policies\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255537 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255552 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwpnc\" (UniqueName: \"kubernetes.io/projected/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-kube-api-access-xwpnc\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255567 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255584 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255615 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89928475-c3fb-415f-a244-6292dc8adc33-config\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255640 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-service-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255656 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255672 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89928475-c3fb-415f-a244-6292dc8adc33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255691 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1f6032-b723-4cb3-a93b-73d053eaf822-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255707 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255723 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jdw\" (UniqueName: \"kubernetes.io/projected/03fd3407-9529-4638-89d6-cfc6b703e510-kube-api-access-j4jdw\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255734 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qsj62"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255777 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j6vfb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255790 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255802 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255812 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xbpcs"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255741 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34163385-0c26-4d54-a06a-11f9ef09901d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: 
\"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255874 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255911 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255935 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255967 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttxr\" (UniqueName: \"kubernetes.io/projected/61a1581f-5367-4535-99bc-3f28547ab766-kube-api-access-2ttxr\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255983 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxpf\" (UniqueName: \"kubernetes.io/projected/460579d9-ed16-49b7-a588-ef20ceb9bbf4-kube-api-access-2mxpf\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.255999 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wks59\" (UniqueName: \"kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256017 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256055 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256081 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/03fd3407-9529-4638-89d6-cfc6b703e510-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgwjg\" (UniqueName: \"kubernetes.io/projected/bad776f4-e24b-41f1-88d8-2b1fe6258783-kube-api-access-tgwjg\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256155 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256185 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkln\" (UniqueName: \"kubernetes.io/projected/36efccb8-7513-43d0-8952-d7ad9546da8e-kube-api-access-2tkln\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256206 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89928475-c3fb-415f-a244-6292dc8adc33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256231 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qjv\" (UniqueName: \"kubernetes.io/projected/5d015dd8-56c9-4f61-b133-4951cda91ca5-kube-api-access-j4qjv\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256253 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-node-pullsecrets\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256305 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-config\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256332 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256362 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256387 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmzs7\" (UniqueName: \"kubernetes.io/projected/72b1bc55-f48b-4d90-ab02-3a80438096b6-kube-api-access-rmzs7\") pod \"downloads-7954f5f757-tcnxt\" (UID: \"72b1bc55-f48b-4d90-ab02-3a80438096b6\") " pod="openshift-console/downloads-7954f5f757-tcnxt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256412 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efccb8-7513-43d0-8952-d7ad9546da8e-proxy-tls\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256441 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94s2\" (UniqueName: \"kubernetes.io/projected/64bd7009-a06a-43e1-b265-3ea78b5801b9-kube-api-access-f94s2\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256469 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256502 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256530 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afa5e1ce-a457-4771-ab06-2654a7801704-audit-dir\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66m62\" (UniqueName: \"kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256580 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d015dd8-56c9-4f61-b133-4951cda91ca5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256602 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256629 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256656 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-encryption-config\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256682 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b070a40-85a6-42e6-a1bd-d834170a9c9c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256704 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b070a40-85a6-42e6-a1bd-d834170a9c9c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256726 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256748 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5hb\" (UniqueName: \"kubernetes.io/projected/afa5e1ce-a457-4771-ab06-2654a7801704-kube-api-access-7h5hb\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256775 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efccb8-7513-43d0-8952-d7ad9546da8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256802 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-metrics-certs\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256829 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256853 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrjv\" (UniqueName: \"kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256877 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-serving-cert\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256898 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-service-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.256921 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frcg\" (UniqueName: \"kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg\") pod \"migrator-59844c95c7-k6wtb\" (UID: \"af5429d7-39d0-4b17-8219-21c8491384ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257003 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ecd5cc-b456-4d69-897c-5fd543842440-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257030 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvdh\" (UniqueName: \"kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-config\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257107 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvd8\" (UniqueName: \"kubernetes.io/projected/9c1f6032-b723-4cb3-a93b-73d053eaf822-kube-api-access-vcvd8\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257149 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit-dir\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257175 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7knj\" (UniqueName: \"kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257201 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-stats-auth\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257241 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-client\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257267 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-image-import-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257295 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a1581f-5367-4535-99bc-3f28547ab766-metrics-tls\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257325 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1f6032-b723-4cb3-a93b-73d053eaf822-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257333 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257352 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257381 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257408 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bad776f4-e24b-41f1-88d8-2b1fe6258783-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257435 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-images\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257566 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.258190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.258342 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tnwpn"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.258668 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.258969 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/03fd3407-9529-4638-89d6-cfc6b703e510-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.259287 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.259542 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit-dir\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.259658 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.260141 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-image-import-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.262042 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.262212 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.262737 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.262781 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-node-pullsecrets\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.263231 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-config\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.264002 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.257454 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-client\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271683 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b070a40-85a6-42e6-a1bd-d834170a9c9c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271744 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-serving-cert\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271785 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271827 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271856 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.272187 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"]
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.272943 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b070a40-85a6-42e6-a1bd-d834170a9c9c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.273375 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.273442 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.274141 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.274401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.274468 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.275401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-etcd-client\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.275774 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-serving-cert\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.276384 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-config\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.277008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64bd7009-a06a-43e1-b265-3ea78b5801b9-service-ca-bundle\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.277694 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-audit\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.278985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03fd3407-9529-4638-89d6-cfc6b703e510-serving-cert\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.279098 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1f6032-b723-4cb3-a93b-73d053eaf822-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.279851 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.279989 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.281371 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.282078 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.282452 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.283559 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.283844 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b070a40-85a6-42e6-a1bd-d834170a9c9c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.284855 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5d015dd8-56c9-4f61-b133-4951cda91ca5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.285577 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d015dd8-56c9-4f61-b133-4951cda91ca5-images\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.287790 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afa5e1ce-a457-4771-ab06-2654a7801704-audit-dir\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.289222 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-encryption-config\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.289696 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw64v"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294317 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294346 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.271897 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ecd5cc-b456-4d69-897c-5fd543842440-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294410 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294446 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/460579d9-ed16-49b7-a588-ef20ceb9bbf4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294479 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-encryption-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294511 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-default-certificate\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294549 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-client\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294576 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294877 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64bd7009-a06a-43e1-b265-3ea78b5801b9-serving-cert\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.293845 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 
11:13:55.291586 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1f6032-b723-4cb3-a93b-73d053eaf822-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.292029 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.292645 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.295337 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.294250 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc 
kubenswrapper[4699]: I0226 11:13:55.295728 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.296564 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert\") pod \"route-controller-manager-6576b87f9c-fq7g8\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.296869 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.296874 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afa5e1ce-a457-4771-ab06-2654a7801704-serving-cert\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.297278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.297556 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.298035 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.275158 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/afa5e1ce-a457-4771-ab06-2654a7801704-audit-policies\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.300103 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a1581f-5367-4535-99bc-3f28547ab766-metrics-tls\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.300190 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.301088 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.301762 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-etcd-client\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.303476 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.304734 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-encryption-config\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.305421 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.307804 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.309135 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.310583 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.320156 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.327303 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qzphl"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.328541 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.336779 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.336989 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rlx7c"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.337960 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.341191 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.341255 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.344566 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.344601 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.344771 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.347804 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.347960 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tcnxt"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.347986 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.352532 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8s5j"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.352592 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"] Feb 26 
11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.352637 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hzqgp"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.356644 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.356703 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.359199 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tnwpn"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.359291 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qzphl"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.360379 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r2phw"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.361235 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.361839 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.363053 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.364138 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.365347 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zqgj9"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.366885 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.368257 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.369927 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2phw"] Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.375182 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.388203 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395336 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/89928475-c3fb-415f-a244-6292dc8adc33-config\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395421 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89928475-c3fb-415f-a244-6292dc8adc33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395454 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34163385-0c26-4d54-a06a-11f9ef09901d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: \"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395470 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395513 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxpf\" (UniqueName: \"kubernetes.io/projected/460579d9-ed16-49b7-a588-ef20ceb9bbf4-kube-api-access-2mxpf\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395541 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wks59\" (UniqueName: \"kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395568 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgwjg\" (UniqueName: \"kubernetes.io/projected/bad776f4-e24b-41f1-88d8-2b1fe6258783-kube-api-access-tgwjg\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395596 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkln\" (UniqueName: \"kubernetes.io/projected/36efccb8-7513-43d0-8952-d7ad9546da8e-kube-api-access-2tkln\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395620 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89928475-c3fb-415f-a244-6292dc8adc33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395694 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efccb8-7513-43d0-8952-d7ad9546da8e-proxy-tls\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395911 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-metrics-certs\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.395951 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrjv\" (UniqueName: \"kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396082 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-serving-cert\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396598 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efccb8-7513-43d0-8952-d7ad9546da8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396640 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frcg\" (UniqueName: \"kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg\") pod \"migrator-59844c95c7-k6wtb\" (UID: \"af5429d7-39d0-4b17-8219-21c8491384ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396681 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ecd5cc-b456-4d69-897c-5fd543842440-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396709 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-service-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396782 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-config\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396823 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.396884 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-stats-auth\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.397099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bad776f4-e24b-41f1-88d8-2b1fe6258783-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.397582 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-config\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.398628 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-client\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.397715 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efccb8-7513-43d0-8952-d7ad9546da8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.397752 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-service-ca\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.398691 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ecd5cc-b456-4d69-897c-5fd543842440-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.398750 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/460579d9-ed16-49b7-a588-ef20ceb9bbf4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.398795 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-default-certificate\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.398848 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgnh6\" (UniqueName: \"kubernetes.io/projected/34163385-0c26-4d54-a06a-11f9ef09901d-kube-api-access-qgnh6\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: \"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.399070 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8m6h\" (UniqueName: \"kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.399104 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a97e310-1811-48a9-a31a-eb9a0321d280-service-ca-bundle\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.400499 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-serving-cert\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.405435 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab52d01-f907-44cb-8d5f-162116d75fc9-etcd-client\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.408698 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.428336 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.448192 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.467941 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.488579 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.508058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.510072 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a97e310-1811-48a9-a31a-eb9a0321d280-service-ca-bundle\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.528365 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.548550 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.569534 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.583004 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-default-certificate\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.588484 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.602255 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-stats-auth\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.608611 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.620446 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a97e310-1811-48a9-a31a-eb9a0321d280-metrics-certs\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.628690 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.638695 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89928475-c3fb-415f-a244-6292dc8adc33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.648221 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.668009 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.676153 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89928475-c3fb-415f-a244-6292dc8adc33-config\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.688012 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.700282 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bad776f4-e24b-41f1-88d8-2b1fe6258783-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.708360 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.728886 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.747674 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.758556 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ecd5cc-b456-4d69-897c-5fd543842440-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.769046 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.788650 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.808786 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.812809 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ecd5cc-b456-4d69-897c-5fd543842440-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.828069 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.849002 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.869300 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.887985 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.908788 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.919947 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34163385-0c26-4d54-a06a-11f9ef09901d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: \"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.928360 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.947729 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.976218 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 11:13:55 crc kubenswrapper[4699]: I0226 11:13:55.988537 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.008094 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.028546 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.048302 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.068420 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.079577 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efccb8-7513-43d0-8952-d7ad9546da8e-proxy-tls\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.088468 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.108343 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.127935 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.131739 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/460579d9-ed16-49b7-a588-ef20ceb9bbf4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.148871 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.188387 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.208581 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.227002 4699 request.go:700] Waited for 1.003087806s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.229669 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.248221 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.269492 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.288505 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.309325 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.329642 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.348509 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.368220 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.389308 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.409177 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.429040 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.448263 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.468394 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.488599 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.507518 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.528053 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.547974 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.567743 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.588318 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.615632 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.627942 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.647928 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.668884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.688309 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.708922 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.728546 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.747678 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.768298 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.788018 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.808531 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.827585 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 26 11:13:56 crc kubenswrapper[4699]: I0226 11:13:56.868670 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.008068 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.027872 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.188420 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.209988 4699 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.228501 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.246825 4699 request.go:700] Waited for 1.908555411s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.248666 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.268170 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.288216 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.308762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.328297 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.347847 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.368563 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.402960 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxpf\" (UniqueName: \"kubernetes.io/projected/460579d9-ed16-49b7-a588-ef20ceb9bbf4-kube-api-access-2mxpf\") pod \"cluster-samples-operator-665b6dd947-9tm8w\" (UID: \"460579d9-ed16-49b7-a588-ef20ceb9bbf4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.442739 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkln\" (UniqueName: \"kubernetes.io/projected/36efccb8-7513-43d0-8952-d7ad9546da8e-kube-api-access-2tkln\") pod \"machine-config-controller-84d6567774-tpntx\" (UID: \"36efccb8-7513-43d0-8952-d7ad9546da8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.543688 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgnh6\" (UniqueName: \"kubernetes.io/projected/34163385-0c26-4d54-a06a-11f9ef09901d-kube-api-access-qgnh6\") pod \"multus-admission-controller-857f4d67dd-k9bv4\" (UID: \"34163385-0c26-4d54-a06a-11f9ef09901d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.588237 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.607810 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.615684 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b070a40-85a6-42e6-a1bd-d834170a9c9c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f8rd9\" (UID: \"6b070a40-85a6-42e6-a1bd-d834170a9c9c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.624441 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.628645 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.631008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpx8q\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-kube-api-access-lpx8q\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.631379 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd7cbed-d0bf-4d8c-933c-4d031170288a-config\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.631493 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-config\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.631674 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.631929 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trrf7\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632000 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd7cbed-d0bf-4d8c-933c-4d031170288a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632063 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632099 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/727302ed-b5c0-49b7-be17-7da9387c16c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632171 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-trusted-ca\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632246 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727302ed-b5c0-49b7-be17-7da9387c16c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632384 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09191eec-0be2-4c45-9249-6c8081d6108a-metrics-tls\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632478 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632509 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632531 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632639 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7chv\" (UniqueName: \"kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632777 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7d5fe0-885a-44e4-bacf-19bceeea178f-serving-cert\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632811 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.632839 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-config\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p"
Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633018 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb2g6\" (UniqueName: \"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-kube-api-access-sb2g6\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID:
\"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633058 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/679ffaa0-41b8-4638-8b4c-4c1f424812e4-machine-approver-tls\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633172 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-auth-proxy-config\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633220 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcndj\" (UniqueName: \"kubernetes.io/projected/0c7d5fe0-885a-44e4-bacf-19bceeea178f-kube-api-access-rcndj\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633285 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633321 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09191eec-0be2-4c45-9249-6c8081d6108a-trusted-ca\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633410 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.633684 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" Feb 26 11:13:57 crc kubenswrapper[4699]: E0226 11:13:57.633910 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.133895492 +0000 UTC m=+183.944721926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.648937 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.668976 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.688622 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.708175 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.728843 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.734877 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:57 crc kubenswrapper[4699]: E0226 11:13:57.735048 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.235026818 +0000 UTC m=+184.045853252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735336 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727302ed-b5c0-49b7-be17-7da9387c16c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735492 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09191eec-0be2-4c45-9249-6c8081d6108a-metrics-tls\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735578 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89f840f7-d21f-4028-b53d-ed0e2061ff15-config-volume\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735674 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-webhook-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735912 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735938 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkpm\" (UniqueName: \"kubernetes.io/projected/6b9ab605-cf5d-43ea-9554-20032a52e23c-kube-api-access-ckkpm\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.735986 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-proxy-tls\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736013 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-plugins-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736036 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00fcad37-801c-4a2c-8599-dabd0f36db6d-cert\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736061 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736085 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736106 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736163 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7chv\" (UniqueName: \"kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736189 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7d5fe0-885a-44e4-bacf-19bceeea178f-serving-cert\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736240 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-config\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736269 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736293 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57mh\" (UniqueName: \"kubernetes.io/projected/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-kube-api-access-c57mh\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736318 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736362 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-key\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736396 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-cabundle\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736437 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-srv-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736572 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-node-bootstrap-token\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736669 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbc38281-b1a4-4c40-a707-a106b651c107-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736725 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-srv-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.736835 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89f840f7-d21f-4028-b53d-ed0e2061ff15-metrics-tls\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737156 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb2g6\" (UniqueName: 
\"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-kube-api-access-sb2g6\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737260 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/679ffaa0-41b8-4638-8b4c-4c1f424812e4-machine-approver-tls\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737296 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qrl\" (UniqueName: \"kubernetes.io/projected/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-kube-api-access-l9qrl\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737372 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-auth-proxy-config\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737397 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aac34b6-aad8-4b68-8180-f68af008611d-config\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737424 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcndj\" (UniqueName: \"kubernetes.io/projected/0c7d5fe0-885a-44e4-bacf-19bceeea178f-kube-api-access-rcndj\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737453 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737475 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09191eec-0be2-4c45-9249-6c8081d6108a-trusted-ca\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737522 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlm8k\" (UniqueName: \"kubernetes.io/projected/0aac34b6-aad8-4b68-8180-f68af008611d-kube-api-access-dlm8k\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737542 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-registration-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737572 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737591 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkh7\" (UniqueName: \"kubernetes.io/projected/00fcad37-801c-4a2c-8599-dabd0f36db6d-kube-api-access-jhkh7\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737636 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjhv\" (UniqueName: \"kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737660 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 
crc kubenswrapper[4699]: E0226 11:13:57.737811 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.2377984 +0000 UTC m=+184.048625024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737834 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc38281-b1a4-4c40-a707-a106b651c107-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737947 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpx8q\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-kube-api-access-lpx8q\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.737974 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-csi-data-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: 
\"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738078 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhksj\" (UniqueName: \"kubernetes.io/projected/89f840f7-d21f-4028-b53d-ed0e2061ff15-kube-api-access-nhksj\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738108 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd7cbed-d0bf-4d8c-933c-4d031170288a-config\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738137 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2p88\" (UniqueName: \"kubernetes.io/projected/23bae79f-03c7-4710-ac97-25da2c7988c4-kube-api-access-p2p88\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738192 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-config\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738209 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0aac34b6-aad8-4b68-8180-f68af008611d-serving-cert\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738228 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-certs\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738265 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738285 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwq64\" (UniqueName: \"kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738302 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-mountpoint-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738318 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44832f39-2c56-4669-b328-7e663f6cacdf-tmpfs\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738351 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ml4\" (UniqueName: \"kubernetes.io/projected/79a9064f-5fcf-42f7-af6f-71aeeb75560e-kube-api-access-l4ml4\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738369 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8mlc\" (UniqueName: \"kubernetes.io/projected/1d3e449f-d082-43cb-951d-53d82fde40ca-kube-api-access-t8mlc\") pod \"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738402 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jtx\" (UniqueName: \"kubernetes.io/projected/5f6e45f7-93da-46b8-9021-d2500076c385-kube-api-access-r6jtx\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738419 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trrf7\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7\") pod 
\"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738435 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-profile-collector-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738488 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd7cbed-d0bf-4d8c-933c-4d031170288a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738521 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-images\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738544 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738571 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k275m\" (UniqueName: \"kubernetes.io/projected/fbc38281-b1a4-4c40-a707-a106b651c107-kube-api-access-k275m\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738632 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738657 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/727302ed-b5c0-49b7-be17-7da9387c16c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738680 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738716 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcrg\" (UniqueName: 
\"kubernetes.io/projected/44832f39-2c56-4669-b328-7e663f6cacdf-kube-api-access-jqcrg\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738754 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738778 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-trusted-ca\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738850 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738875 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-socket-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc 
kubenswrapper[4699]: I0226 11:13:57.738925 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.738963 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3e449f-d082-43cb-951d-53d82fde40ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.742672 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.743249 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.748083 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.772786 4699 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.788767 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.813701 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841401 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:57 crc kubenswrapper[4699]: E0226 11:13:57.841479 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.341462881 +0000 UTC m=+184.152289315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841628 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44832f39-2c56-4669-b328-7e663f6cacdf-tmpfs\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841663 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jtx\" (UniqueName: \"kubernetes.io/projected/5f6e45f7-93da-46b8-9021-d2500076c385-kube-api-access-r6jtx\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841688 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ml4\" (UniqueName: \"kubernetes.io/projected/79a9064f-5fcf-42f7-af6f-71aeeb75560e-kube-api-access-l4ml4\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841714 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8mlc\" (UniqueName: \"kubernetes.io/projected/1d3e449f-d082-43cb-951d-53d82fde40ca-kube-api-access-t8mlc\") pod 
\"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841751 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-profile-collector-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841833 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-images\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841857 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841885 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k275m\" (UniqueName: \"kubernetes.io/projected/fbc38281-b1a4-4c40-a707-a106b651c107-kube-api-access-k275m\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.841927 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.842257 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44832f39-2c56-4669-b328-7e663f6cacdf-tmpfs\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.842842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843425 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-images\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843605 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcrg\" (UniqueName: \"kubernetes.io/projected/44832f39-2c56-4669-b328-7e663f6cacdf-kube-api-access-jqcrg\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843645 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843690 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-socket-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843727 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843781 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3e449f-d082-43cb-951d-53d82fde40ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843863 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/89f840f7-d21f-4028-b53d-ed0e2061ff15-config-volume\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843892 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-webhook-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843939 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843964 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkpm\" (UniqueName: \"kubernetes.io/projected/6b9ab605-cf5d-43ea-9554-20032a52e23c-kube-api-access-ckkpm\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.843997 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-proxy-tls\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.844037 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-plugins-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.844056 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00fcad37-801c-4a2c-8599-dabd0f36db6d-cert\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.844826 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89f840f7-d21f-4028-b53d-ed0e2061ff15-config-volume\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845454 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845482 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c57mh\" (UniqueName: \"kubernetes.io/projected/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-kube-api-access-c57mh\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845505 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-key\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845537 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-cabundle\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845558 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-srv-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845580 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-node-bootstrap-token\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbc38281-b1a4-4c40-a707-a106b651c107-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: 
I0226 11:13:57.845627 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-srv-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845653 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89f840f7-d21f-4028-b53d-ed0e2061ff15-metrics-tls\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qrl\" (UniqueName: \"kubernetes.io/projected/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-kube-api-access-l9qrl\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845861 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aac34b6-aad8-4b68-8180-f68af008611d-config\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845912 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845963 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlm8k\" (UniqueName: \"kubernetes.io/projected/0aac34b6-aad8-4b68-8180-f68af008611d-kube-api-access-dlm8k\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.845987 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-registration-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846008 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846029 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkh7\" (UniqueName: \"kubernetes.io/projected/00fcad37-801c-4a2c-8599-dabd0f36db6d-kube-api-access-jhkh7\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846054 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjhv\" (UniqueName: \"kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv\") pod 
\"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846088 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc38281-b1a4-4c40-a707-a106b651c107-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846151 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-csi-data-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846215 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhksj\" (UniqueName: \"kubernetes.io/projected/89f840f7-d21f-4028-b53d-ed0e2061ff15-kube-api-access-nhksj\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846260 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2p88\" (UniqueName: \"kubernetes.io/projected/23bae79f-03c7-4710-ac97-25da2c7988c4-kube-api-access-p2p88\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0aac34b6-aad8-4b68-8180-f68af008611d-serving-cert\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846324 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-certs\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846372 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwq64\" (UniqueName: \"kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.846396 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-mountpoint-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.847159 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.847442 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-socket-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.850404 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc38281-b1a4-4c40-a707-a106b651c107-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.850547 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-csi-data-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.851466 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aac34b6-aad8-4b68-8180-f68af008611d-config\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: E0226 11:13:57.851892 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.351874188 +0000 UTC m=+184.162700622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.852613 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.854768 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.855183 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.855245 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-mountpoint-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.855345 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-registration-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.855814 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-certs\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.856375 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbc38281-b1a4-4c40-a707-a106b651c107-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.857421 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-cabundle\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.857685 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-profile-collector-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.858909 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89f840f7-d21f-4028-b53d-ed0e2061ff15-metrics-tls\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.858988 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3e449f-d082-43cb-951d-53d82fde40ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.859185 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a9064f-5fcf-42f7-af6f-71aeeb75560e-plugins-dir\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.859514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-webhook-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.859596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44832f39-2c56-4669-b328-7e663f6cacdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.859665 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.859824 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.860159 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f6e45f7-93da-46b8-9021-d2500076c385-srv-cert\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.860353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00fcad37-801c-4a2c-8599-dabd0f36db6d-cert\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.860857 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23bae79f-03c7-4710-ac97-25da2c7988c4-signing-key\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.861555 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b9ab605-cf5d-43ea-9554-20032a52e23c-node-bootstrap-token\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.861754 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aac34b6-aad8-4b68-8180-f68af008611d-serving-cert\") pod \"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.862178 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-srv-cert\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.862689 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-proxy-tls\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.878713 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.879701 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx"] Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.887887 4699 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.891223 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89928475-c3fb-415f-a244-6292dc8adc33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gpjvh\" (UID: \"89928475-c3fb-415f-a244-6292dc8adc33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:57 crc kubenswrapper[4699]: W0226 11:13:57.892256 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36efccb8_7513_43d0_8952_d7ad9546da8e.slice/crio-837f0748289c32dab50176203bf6899f152b0041109c72b2c3ad10609d715f51 WatchSource:0}: Error finding container 837f0748289c32dab50176203bf6899f152b0041109c72b2c3ad10609d715f51: Status 404 returned error can't find the container with id 837f0748289c32dab50176203bf6899f152b0041109c72b2c3ad10609d715f51 Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.907363 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w"] Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.908446 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.928172 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.947059 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:57 crc kubenswrapper[4699]: E0226 11:13:57.947904 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.447885603 +0000 UTC m=+184.258712037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.948264 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.966613 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.978835 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.991736 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttxr\" (UniqueName: \"kubernetes.io/projected/61a1581f-5367-4535-99bc-3f28547ab766-kube-api-access-2ttxr\") pod \"dns-operator-744455d44c-xbpcs\" (UID: \"61a1581f-5367-4535-99bc-3f28547ab766\") " pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" Feb 26 11:13:57 crc kubenswrapper[4699]: I0226 11:13:57.992186 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.003875 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvdh\" (UniqueName: \"kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh\") pod \"console-f9d7485db-hnsh7\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.007286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmzs7\" (UniqueName: \"kubernetes.io/projected/72b1bc55-f48b-4d90-ab02-3a80438096b6-kube-api-access-rmzs7\") pod \"downloads-7954f5f757-tcnxt\" (UID: \"72b1bc55-f48b-4d90-ab02-3a80438096b6\") " pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.009794 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.028620 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.035188 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvd8\" (UniqueName: \"kubernetes.io/projected/9c1f6032-b723-4cb3-a93b-73d053eaf822-kube-api-access-vcvd8\") pod \"openshift-controller-manager-operator-756b6f6bc6-2zshh\" (UID: \"9c1f6032-b723-4cb3-a93b-73d053eaf822\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.042967 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7knj\" (UniqueName: \"kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj\") pod \"oauth-openshift-558db77b4-22qbz\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.049071 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.049101 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.049866 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.549848513 +0000 UTC m=+184.360675127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.053468 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjjgq\" (UniqueName: \"kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq\") pod \"controller-manager-879f6c89f-gsl8w\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.070618 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.085373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgwjg\" (UniqueName: \"kubernetes.io/projected/bad776f4-e24b-41f1-88d8-2b1fe6258783-kube-api-access-tgwjg\") pod \"control-plane-machine-set-operator-78cbb6b69f-p9wj4\" (UID: \"bad776f4-e24b-41f1-88d8-2b1fe6258783\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.085373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qjv\" (UniqueName: \"kubernetes.io/projected/5d015dd8-56c9-4f61-b133-4951cda91ca5-kube-api-access-j4qjv\") pod \"machine-api-operator-5694c8668f-pw64v\" (UID: \"5d015dd8-56c9-4f61-b133-4951cda91ca5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 
11:13:58.088591 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.092147 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94s2\" (UniqueName: \"kubernetes.io/projected/64bd7009-a06a-43e1-b265-3ea78b5801b9-kube-api-access-f94s2\") pod \"authentication-operator-69f744f599-qsj62\" (UID: \"64bd7009-a06a-43e1-b265-3ea78b5801b9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.117905 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.128777 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727302ed-b5c0-49b7-be17-7da9387c16c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.129546 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.130054 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.140482 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09191eec-0be2-4c45-9249-6c8081d6108a-metrics-tls\") 
pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.149242 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.150712 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.150908 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.650870046 +0000 UTC m=+184.461696480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.151693 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.151925 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k9bv4"] Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.152205 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.652182045 +0000 UTC m=+184.463008649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.158422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" event={"ID":"36efccb8-7513-43d0-8952-d7ad9546da8e","Type":"ContainerStarted","Data":"837f0748289c32dab50176203bf6899f152b0041109c72b2c3ad10609d715f51"} Feb 26 11:13:58 crc kubenswrapper[4699]: W0226 11:13:58.159744 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34163385_0c26_4d54_a06a_11f9ef09901d.slice/crio-01184b470301f94db3f051ddb11b4c9962804b7dc43b0df93c9a87ed950eca02 WatchSource:0}: Error finding container 01184b470301f94db3f051ddb11b4c9962804b7dc43b0df93c9a87ed950eca02: Status 404 returned error can't find the container with id 01184b470301f94db3f051ddb11b4c9962804b7dc43b0df93c9a87ed950eca02 Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.185332 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.227669 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 
11:13:58.237674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-config\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.252826 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.253009 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.75298238 +0000 UTC m=+184.563808814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.253809 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.254087 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.754078382 +0000 UTC m=+184.564904816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.263245 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb2g6\" (UniqueName: \"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-kube-api-access-sb2g6\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.269972 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.280699 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/679ffaa0-41b8-4638-8b4c-4c1f424812e4-machine-approver-tls\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.287926 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.298081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/679ffaa0-41b8-4638-8b4c-4c1f424812e4-auth-proxy-config\") pod 
\"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.334830 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.338861 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09191eec-0be2-4c45-9249-6c8081d6108a-trusted-ca\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.348954 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.354515 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.354661 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.854628661 +0000 UTC m=+184.665455095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.354938 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.355153 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.355505 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.855490516 +0000 UTC m=+184.666316950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.366289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7d5fe0-885a-44e4-bacf-19bceeea178f-serving-cert\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.409053 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.420072 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd7cbed-d0bf-4d8c-933c-4d031170288a-config\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.445921 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/727302ed-b5c0-49b7-be17-7da9387c16c3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.448211 4699 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.452576 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/727302ed-b5c0-49b7-be17-7da9387c16c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvgmh\" (UID: \"727302ed-b5c0-49b7-be17-7da9387c16c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.455779 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.456565 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:58.9565475 +0000 UTC m=+184.767373934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.475455 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.481598 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-trusted-ca\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.507620 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.510537 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7d5fe0-885a-44e4-bacf-19bceeea178f-config\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.522718 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.528163 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.548228 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.554243 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd7cbed-d0bf-4d8c-933c-4d031170288a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.558055 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.558456 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.058441518 +0000 UTC m=+184.869267952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.562995 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trrf7\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.563302 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.568174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.569229 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.573664 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.590651 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.600677 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwpnc\" (UniqueName: \"kubernetes.io/projected/a550b2ea-3ce7-4df3-bbf5-f1025afca8c1-kube-api-access-xwpnc\") pod \"apiserver-76f77b778f-f8s5j\" (UID: \"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1\") " pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.608469 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.617973 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5hb\" (UniqueName: \"kubernetes.io/projected/afa5e1ce-a457-4771-ab06-2654a7801704-kube-api-access-7h5hb\") pod \"apiserver-7bbb656c7d-tm98c\" (UID: \"afa5e1ce-a457-4771-ab06-2654a7801704\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.628222 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.651014 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.655455 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66m62\" (UniqueName: \"kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62\") pod \"route-controller-manager-6576b87f9c-fq7g8\" 
(UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.659807 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.660195 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.160143441 +0000 UTC m=+184.970969875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.660672 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.661425 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.161416269 +0000 UTC m=+184.972242703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.669539 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jdw\" (UniqueName: \"kubernetes.io/projected/03fd3407-9529-4638-89d6-cfc6b703e510-kube-api-access-j4jdw\") pod \"openshift-config-operator-7777fb866f-vzj5b\" (UID: \"03fd3407-9529-4638-89d6-cfc6b703e510\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.705531 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jtx\" (UniqueName: \"kubernetes.io/projected/5f6e45f7-93da-46b8-9021-d2500076c385-kube-api-access-r6jtx\") pod \"olm-operator-6b444d44fb-czs8l\" (UID: \"5f6e45f7-93da-46b8-9021-d2500076c385\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.723777 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8mlc\" (UniqueName: \"kubernetes.io/projected/1d3e449f-d082-43cb-951d-53d82fde40ca-kube-api-access-t8mlc\") pod \"package-server-manager-789f6589d5-dk749\" (UID: \"1d3e449f-d082-43cb-951d-53d82fde40ca\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.740471 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9"] Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.742911 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ml4\" (UniqueName: \"kubernetes.io/projected/79a9064f-5fcf-42f7-af6f-71aeeb75560e-kube-api-access-l4ml4\") pod \"csi-hostpathplugin-qzphl\" (UID: \"79a9064f-5fcf-42f7-af6f-71aeeb75560e\") " pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.761886 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.762096 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.262064 +0000 UTC m=+185.072890434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.762523 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.762895 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.262882735 +0000 UTC m=+185.073709169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.762927 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k275m\" (UniqueName: \"kubernetes.io/projected/fbc38281-b1a4-4c40-a707-a106b651c107-kube-api-access-k275m\") pod \"openshift-apiserver-operator-796bbdcf4f-tg675\" (UID: \"fbc38281-b1a4-4c40-a707-a106b651c107\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.782309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcrg\" (UniqueName: \"kubernetes.io/projected/44832f39-2c56-4669-b328-7e663f6cacdf-kube-api-access-jqcrg\") pod \"packageserver-d55dfcdfc-w7nqx\" (UID: \"44832f39-2c56-4669-b328-7e663f6cacdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.802145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2p88\" (UniqueName: \"kubernetes.io/projected/23bae79f-03c7-4710-ac97-25da2c7988c4-kube-api-access-p2p88\") pod \"service-ca-9c57cc56f-zqgj9\" (UID: \"23bae79f-03c7-4710-ac97-25da2c7988c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.823840 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlm8k\" (UniqueName: \"kubernetes.io/projected/0aac34b6-aad8-4b68-8180-f68af008611d-kube-api-access-dlm8k\") pod 
\"service-ca-operator-777779d784-ctfcc\" (UID: \"0aac34b6-aad8-4b68-8180-f68af008611d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:58 crc kubenswrapper[4699]: W0226 11:13:58.826796 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b070a40_85a6_42e6_a1bd_d834170a9c9c.slice/crio-c2296dd5d7be174ff8b90aed65a4512f5fe3cc9b83f7f09a1988401db2385307 WatchSource:0}: Error finding container c2296dd5d7be174ff8b90aed65a4512f5fe3cc9b83f7f09a1988401db2385307: Status 404 returned error can't find the container with id c2296dd5d7be174ff8b90aed65a4512f5fe3cc9b83f7f09a1988401db2385307 Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.843811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhksj\" (UniqueName: \"kubernetes.io/projected/89f840f7-d21f-4028-b53d-ed0e2061ff15-kube-api-access-nhksj\") pod \"dns-default-tnwpn\" (UID: \"89f840f7-d21f-4028-b53d-ed0e2061ff15\") " pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.858558 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.859058 4699 projected.go:288] Couldn't get configMap openshift-ingress/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.859077 4699 projected.go:194] Error preparing data for projected volume kube-api-access-wks59 for pod openshift-ingress/router-default-5444994796-xm88w: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.859146 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59 podName:4a97e310-1811-48a9-a31a-eb9a0321d280 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.359109166 +0000 UTC m=+185.169935600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wks59" (UniqueName: "kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59") pod "router-default-5444994796-xm88w" (UID: "4a97e310-1811-48a9-a31a-eb9a0321d280") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.863720 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.864287 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.864411 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.364391872 +0000 UTC m=+185.175218306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.864754 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.864914 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjhv\" (UniqueName: 
\"kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv\") pod \"collect-profiles-29535060-f97rd\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.865025 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.36501683 +0000 UTC m=+185.175843264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.865078 4699 request.go:700] Waited for 1.014399349s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.883399 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qrl\" (UniqueName: \"kubernetes.io/projected/1a9875bc-9f2e-4887-8dc6-a00cc789eb4a-kube-api-access-l9qrl\") pod \"catalog-operator-68c6474976-xvgnb\" (UID: \"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.889194 4699 projected.go:288] Couldn't get configMap 
openshift-kube-storage-version-migrator-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.889251 4699 projected.go:194] Error preparing data for projected volume kube-api-access-6wrjv for pod openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.889332 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv podName:e0ecd5cc-b456-4d69-897c-5fd543842440 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.389309087 +0000 UTC m=+185.200135591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6wrjv" (UniqueName: "kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv") pod "kube-storage-version-migrator-operator-b67b599dd-gngtb" (UID: "e0ecd5cc-b456-4d69-897c-5fd543842440") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.895154 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.905553 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.906972 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkpm\" (UniqueName: \"kubernetes.io/projected/6b9ab605-cf5d-43ea-9554-20032a52e23c-kube-api-access-ckkpm\") pod \"machine-config-server-rlx7c\" (UID: \"6b9ab605-cf5d-43ea-9554-20032a52e23c\") " pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.909369 4699 projected.go:288] Couldn't get configMap openshift-kube-storage-version-migrator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.909400 4699 projected.go:194] Error preparing data for projected volume kube-api-access-7frcg for pod openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.909491 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg podName:af5429d7-39d0-4b17-8219-21c8491384ae nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.409463582 +0000 UTC m=+185.220290086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7frcg" (UniqueName: "kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg") pod "migrator-59844c95c7-k6wtb" (UID: "af5429d7-39d0-4b17-8219-21c8491384ae") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.917540 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.924350 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwq64\" (UniqueName: \"kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64\") pod \"marketplace-operator-79b997595-cd5qf\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.927776 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.930146 4699 projected.go:288] Couldn't get configMap openshift-etcd-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.930169 4699 projected.go:194] Error preparing data for projected volume kube-api-access-d8m6h for pod openshift-etcd-operator/etcd-operator-b45778765-j6vfb: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.930226 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h podName:fab52d01-f907-44cb-8d5f-162116d75fc9 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.430207245 +0000 UTC m=+185.241033669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-d8m6h" (UniqueName: "kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h") pod "etcd-operator-b45778765-j6vfb" (UID: "fab52d01-f907-44cb-8d5f-162116d75fc9") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.938297 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.948512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkh7\" (UniqueName: \"kubernetes.io/projected/00fcad37-801c-4a2c-8599-dabd0f36db6d-kube-api-access-jhkh7\") pod \"ingress-canary-r2phw\" (UID: \"00fcad37-801c-4a2c-8599-dabd0f36db6d\") " pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.952718 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tnwpn" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.966174 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.966468 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.466443625 +0000 UTC m=+185.277270059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.966989 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:58 crc kubenswrapper[4699]: E0226 11:13:58.967466 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.467454005 +0000 UTC m=+185.278280439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.968599 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.971508 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.973254 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c57mh\" (UniqueName: \"kubernetes.io/projected/b4f243e8-e08c-420e-a78b-02e6a14bf5fe-kube-api-access-c57mh\") pod \"machine-config-operator-74547568cd-pxsr8\" (UID: \"b4f243e8-e08c-420e-a78b-02e6a14bf5fe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.981154 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rlx7c" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.987531 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 11:13:58 crc kubenswrapper[4699]: I0226 11:13:58.988657 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2phw" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.007641 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.028406 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.048023 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.054084 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.067898 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.068449 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.068615 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.568600201 +0000 UTC m=+185.379426625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.070882 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.086008 4699 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" secret="" err="failed to sync secret cache: timed out waiting for the condition" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.086075 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.101978 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.113246 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.116593 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.126052 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zqgj9"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.126581 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.126879 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.146529 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.146606 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.155763 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.160840 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.166957 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.171971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.172153 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.172442 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.672426966 +0000 UTC m=+185.483253400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.173718 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.175153 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" event={"ID":"6b070a40-85a6-42e6-a1bd-d834170a9c9c","Type":"ContainerStarted","Data":"c2296dd5d7be174ff8b90aed65a4512f5fe3cc9b83f7f09a1988401db2385307"} Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.175414 4699 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" secret="" err="failed to sync secret cache: timed out waiting for the condition" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.175452 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.176038 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.177315 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" event={"ID":"23bae79f-03c7-4710-ac97-25da2c7988c4","Type":"ContainerStarted","Data":"3471c9e71d3cd8d924d552747899f6cd1e83bdb74118e8458c7a8e924f4465a9"} Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.178667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" event={"ID":"34163385-0c26-4d54-a06a-11f9ef09901d","Type":"ContainerStarted","Data":"01184b470301f94db3f051ddb11b4c9962804b7dc43b0df93c9a87ed950eca02"} Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.183943 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.187179 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.199438 4699 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.208157 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.213717 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd"] Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.221173 4699 projected.go:288] Couldn't get configMap openshift-kube-apiserver-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.221216 4699 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.221287 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access podName:8cd7cbed-d0bf-4d8c-933c-4d031170288a nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.721264078 +0000 UTC m=+185.532090522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access") pod "kube-apiserver-operator-766d6c64bb-vmxjr" (UID: "8cd7cbed-d0bf-4d8c-933c-4d031170288a") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.237034 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: W0226 11:13:59.246773 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6e45f7_93da_46b8_9021_d2500076c385.slice/crio-518234f9f0a4807650213e3d0e0a0e32a30602572f66d2aa83784c4337aef135 WatchSource:0}: Error finding container 518234f9f0a4807650213e3d0e0a0e32a30602572f66d2aa83784c4337aef135: Status 404 returned error can't find the container with id 518234f9f0a4807650213e3d0e0a0e32a30602572f66d2aa83784c4337aef135 Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.251316 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.253193 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.268202 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.273463 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.273873 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.773852821 +0000 UTC m=+185.584679255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.287331 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.301355 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6956c039-cf77-429b-8f7f-f93ba195d321-metrics-certs\") pod \"network-metrics-daemon-v5ctv\" (UID: \"6956c039-cf77-429b-8f7f-f93ba195d321\") " pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.308996 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.310569 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.327106 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.337185 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.368323 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.376990 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.377060 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.377239 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wks59\" (UniqueName: \"kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.377448 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.877424319 +0000 UTC m=+185.688250943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.382256 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wks59\" (UniqueName: \"kubernetes.io/projected/4a97e310-1811-48a9-a31a-eb9a0321d280-kube-api-access-wks59\") pod \"router-default-5444994796-xm88w\" (UID: \"4a97e310-1811-48a9-a31a-eb9a0321d280\") " pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.387266 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 
11:13:59.387676 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.390431 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.408760 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.415736 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.429153 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.430481 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:13:59 crc kubenswrapper[4699]: W0226 11:13:59.447797 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44832f39_2c56_4669_b328_7e663f6cacdf.slice/crio-dcfcf531fe129586ab3d0914bc42572e1dd887de749c4a861cb770ab36f5adda WatchSource:0}: Error finding container dcfcf531fe129586ab3d0914bc42572e1dd887de749c4a861cb770ab36f5adda: Status 404 returned error can't find the container with id dcfcf531fe129586ab3d0914bc42572e1dd887de749c4a861cb770ab36f5adda Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.450423 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.451513 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.468208 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.475413 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.477801 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.478043 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7frcg\" (UniqueName: \"kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg\") pod \"migrator-59844c95c7-k6wtb\" (UID: \"af5429d7-39d0-4b17-8219-21c8491384ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.478096 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8m6h\" (UniqueName: \"kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.478171 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrjv\" (UniqueName: \"kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.478661 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.978633487 +0000 UTC m=+185.789459961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.484535 4699 projected.go:194] Error preparing data for projected volume kube-api-access-t7chv for pod openshift-cluster-machine-approver/machine-approver-56656f9798-p742p: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.484643 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv podName:679ffaa0-41b8-4638-8b4c-4c1f424812e4 nodeName:}" failed. No retries permitted until 2026-02-26 11:13:59.984619514 +0000 UTC m=+185.795445948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t7chv" (UniqueName: "kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv") pod "machine-approver-56656f9798-p742p" (UID: "679ffaa0-41b8-4638-8b4c-4c1f424812e4") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.489010 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.505322 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8m6h\" (UniqueName: \"kubernetes.io/projected/fab52d01-f907-44cb-8d5f-162116d75fc9-kube-api-access-d8m6h\") pod \"etcd-operator-b45778765-j6vfb\" (UID: \"fab52d01-f907-44cb-8d5f-162116d75fc9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.509879 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.513259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7frcg\" (UniqueName: \"kubernetes.io/projected/af5429d7-39d0-4b17-8219-21c8491384ae-kube-api-access-7frcg\") pod \"migrator-59844c95c7-k6wtb\" (UID: \"af5429d7-39d0-4b17-8219-21c8491384ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.519138 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpx8q\" (UniqueName: \"kubernetes.io/projected/09191eec-0be2-4c45-9249-6c8081d6108a-kube-api-access-lpx8q\") pod \"ingress-operator-5b745b69d9-wcnnr\" (UID: \"09191eec-0be2-4c45-9249-6c8081d6108a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 
11:13:59.521357 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrjv\" (UniqueName: \"kubernetes.io/projected/e0ecd5cc-b456-4d69-897c-5fd543842440-kube-api-access-6wrjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-gngtb\" (UID: \"e0ecd5cc-b456-4d69-897c-5fd543842440\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.525410 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2phw"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.528489 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcndj\" (UniqueName: \"kubernetes.io/projected/0c7d5fe0-885a-44e4-bacf-19bceeea178f-kube-api-access-rcndj\") pod \"console-operator-58897d9998-hzqgp\" (UID: \"0c7d5fe0-885a-44e4-bacf-19bceeea178f\") " pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.581678 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.582470 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.082442992 +0000 UTC m=+185.893269606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.607804 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.614413 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v5ctv" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.647764 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.648804 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.672257 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh"] Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.682949 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.683498 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.183479285 +0000 UTC m=+185.994305719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.686852 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.687920 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" Feb 26 11:13:59 crc kubenswrapper[4699]: W0226 11:13:59.697868 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00fcad37_801c_4a2c_8599_dabd0f36db6d.slice/crio-83e0b6c3f4ca93790fd661d1c95c61e7c1263345d7f3b26630b620b6112340e3 WatchSource:0}: Error finding container 83e0b6c3f4ca93790fd661d1c95c61e7c1263345d7f3b26630b620b6112340e3: Status 404 returned error can't find the container with id 83e0b6c3f4ca93790fd661d1c95c61e7c1263345d7f3b26630b620b6112340e3 Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.712602 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.717572 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.749817 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.755523 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.776447 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.784927 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.785016 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.785488 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.285473787 +0000 UTC m=+186.096300221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.786033 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.788513 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.796681 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.800238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cd7cbed-d0bf-4d8c-933c-4d031170288a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vmxjr\" (UID: \"8cd7cbed-d0bf-4d8c-933c-4d031170288a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.889290 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.889521 4699 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.389489678 +0000 UTC m=+186.200316122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.890007 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.890610 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.39059438 +0000 UTC m=+186.201420814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.907502 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.915062 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.991617 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.991926 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7chv\" (UniqueName: \"kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:13:59 crc kubenswrapper[4699]: E0226 11:13:59.994296 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-26 11:14:00.494261011 +0000 UTC m=+186.305087605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:13:59 crc kubenswrapper[4699]: I0226 11:13:59.997733 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7chv\" (UniqueName: \"kubernetes.io/projected/679ffaa0-41b8-4638-8b4c-4c1f424812e4-kube-api-access-t7chv\") pod \"machine-approver-56656f9798-p742p\" (UID: \"679ffaa0-41b8-4638-8b4c-4c1f424812e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.005105 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.057489 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.094384 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.094889 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.594869941 +0000 UTC m=+186.405696375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.097368 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qzphl"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.153859 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.153912 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tnwpn"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.160373 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qsj62"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.160820 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xbpcs"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.168737 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535074-bjfld"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.170323 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.171068 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535074-bjfld"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.177225 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.188257 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.197612 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.198079 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.698059438 +0000 UTC m=+186.508885872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.212399 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" event={"ID":"1d3e449f-d082-43cb-951d-53d82fde40ca","Type":"ContainerStarted","Data":"317c5b2414bf1b5b19ec4ab8c5f7d2097d6a25f530ff2a0a2426b0eae6da2595"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.214678 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" event={"ID":"44832f39-2c56-4669-b328-7e663f6cacdf","Type":"ContainerStarted","Data":"dcfcf531fe129586ab3d0914bc42572e1dd887de749c4a861cb770ab36f5adda"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.217822 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2phw" event={"ID":"00fcad37-801c-4a2c-8599-dabd0f36db6d","Type":"ContainerStarted","Data":"83e0b6c3f4ca93790fd661d1c95c61e7c1263345d7f3b26630b620b6112340e3"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.220346 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" event={"ID":"9c1f6032-b723-4cb3-a93b-73d053eaf822","Type":"ContainerStarted","Data":"16e3e5ef471b13f7f9217a4479069861dfc04956adc76207b12f862e2b4b3359"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.221095 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" event={"ID":"0aac34b6-aad8-4b68-8180-f68af008611d","Type":"ContainerStarted","Data":"9c569a7b4a5730a6fd5f622d9d8580518b4d59861706d52f5fa570b1144b6f7d"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.221900 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" event={"ID":"5f6e45f7-93da-46b8-9021-d2500076c385","Type":"ContainerStarted","Data":"518234f9f0a4807650213e3d0e0a0e32a30602572f66d2aa83784c4337aef135"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.223182 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" event={"ID":"36efccb8-7513-43d0-8952-d7ad9546da8e","Type":"ContainerStarted","Data":"5ef61d4e75602d31a9e85e25cd1cda7253e5ef4c2b643144b5584ca6ef9e0885"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.224010 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" event={"ID":"fbc38281-b1a4-4c40-a707-a106b651c107","Type":"ContainerStarted","Data":"b265116c888ddda8a0c13680e3b228945374fe791d1bcfef434de7fce7ea1caa"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.236025 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" event={"ID":"34163385-0c26-4d54-a06a-11f9ef09901d","Type":"ContainerStarted","Data":"7fffc66eeb4956b76f91d87b3eef8ab447b9fb0a0df8dbccb897804b145f18ab"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.240529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" event={"ID":"5f8a28b8-c47b-4288-877f-8e90a3b581b5","Type":"ContainerStarted","Data":"9cc8202a0a693b54f9a7afa4f72146520cc57d28a34110bea4d4992553af18b6"} Feb 26 11:14:00 crc kubenswrapper[4699]: 
I0226 11:14:00.241547 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" event={"ID":"6b070a40-85a6-42e6-a1bd-d834170a9c9c","Type":"ContainerStarted","Data":"508efe6a42d61af98e55fb58855c1da8e28f40533928f5ed84603c8a6eae2c6e"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.248417 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.248615 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rlx7c" event={"ID":"6b9ab605-cf5d-43ea-9554-20032a52e23c","Type":"ContainerStarted","Data":"13d23702e952465cfb024c403f3a037dbd8825ad6aadefa49886bdf076336718"} Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.252443 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.254355 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" event={"ID":"460579d9-ed16-49b7-a588-ef20ceb9bbf4","Type":"ContainerStarted","Data":"63bb7c60f2814c3bca3cd8b3c25228a93bf6b0e1f5746a31ee1416ba021a86ef"} Feb 26 11:14:00 crc kubenswrapper[4699]: W0226 11:14:00.278295 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9875bc_9f2e_4887_8dc6_a00cc789eb4a.slice/crio-2596306f1087bf01b8a58fd8b0bb65d12065ae02e0851934fb89d1efcdbc1abe WatchSource:0}: Error finding container 2596306f1087bf01b8a58fd8b0bb65d12065ae02e0851934fb89d1efcdbc1abe: Status 404 returned error can't find the container with id 2596306f1087bf01b8a58fd8b0bb65d12065ae02e0851934fb89d1efcdbc1abe Feb 26 11:14:00 crc 
kubenswrapper[4699]: I0226 11:14:00.300231 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq8lx\" (UniqueName: \"kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx\") pod \"auto-csr-approver-29535074-bjfld\" (UID: \"30d444da-9127-459c-97c6-cdcff5b20e67\") " pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.300289 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.302494 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.802477201 +0000 UTC m=+186.613303635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.404368 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.404581 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.904547794 +0000 UTC m=+186.715374228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.404703 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq8lx\" (UniqueName: \"kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx\") pod \"auto-csr-approver-29535074-bjfld\" (UID: \"30d444da-9127-459c-97c6-cdcff5b20e67\") " pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.404740 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.405051 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:00.905040499 +0000 UTC m=+186.715866933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.460665 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq8lx\" (UniqueName: \"kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx\") pod \"auto-csr-approver-29535074-bjfld\" (UID: \"30d444da-9127-459c-97c6-cdcff5b20e67\") " pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.470470 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw64v"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.474077 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tcnxt"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.476342 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.511465 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.511999 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.011982667 +0000 UTC m=+186.822809101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.552230 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.563839 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.574003 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.574051 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.613044 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:00 crc kubenswrapper[4699]: 
E0226 11:14:00.613428 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.113414362 +0000 UTC m=+186.924240796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.714316 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.714707 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.214692412 +0000 UTC m=+187.025518846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.743288 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.749207 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.757347 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.761502 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8s5j"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.766347 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.816050 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.817163 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.317145817 +0000 UTC m=+187.127972251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.845696 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.875066 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b"] Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.926240 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:00 crc kubenswrapper[4699]: E0226 11:14:00.926608 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.426593718 +0000 UTC m=+187.237420142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:00 crc kubenswrapper[4699]: I0226 11:14:00.953853 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v5ctv"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.028239 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.028632 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.52861707 +0000 UTC m=+187.339443504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.063221 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr"] Feb 26 11:14:01 crc kubenswrapper[4699]: W0226 11:14:01.066685 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03fd3407_9529_4638_89d6_cfc6b703e510.slice/crio-c353d9bed35789a32b2ea23a2f94f9c2e40f463057a2ff1c95e594e8b545182a WatchSource:0}: Error finding container c353d9bed35789a32b2ea23a2f94f9c2e40f463057a2ff1c95e594e8b545182a: Status 404 returned error can't find the container with id c353d9bed35789a32b2ea23a2f94f9c2e40f463057a2ff1c95e594e8b545182a Feb 26 11:14:01 crc kubenswrapper[4699]: W0226 11:14:01.101837 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6956c039_cf77_429b_8f7f_f93ba195d321.slice/crio-2d2c52d524532517fe59c7ad81e75737aa073b7328d6c10323d7e6a7fd831621 WatchSource:0}: Error finding container 2d2c52d524532517fe59c7ad81e75737aa073b7328d6c10323d7e6a7fd831621: Status 404 returned error can't find the container with id 2d2c52d524532517fe59c7ad81e75737aa073b7328d6c10323d7e6a7fd831621 Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.129503 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.129640 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.629612332 +0000 UTC m=+187.440438766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.129797 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.130260 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.630241291 +0000 UTC m=+187.441067775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.207241 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j6vfb"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.223711 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.231321 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.232690 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.232859 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.73283318 +0000 UTC m=+187.543659624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.233014 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.233399 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.733385616 +0000 UTC m=+187.544212050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.248530 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hzqgp"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.294358 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" event={"ID":"9c1f6032-b723-4cb3-a93b-73d053eaf822","Type":"ContainerStarted","Data":"ca14ca703cbd66fd56d88323124a5c239a71a62cb0959b8249347a60a5a6bd7a"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.296696 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" event={"ID":"afa5e1ce-a457-4771-ab06-2654a7801704","Type":"ContainerStarted","Data":"28d18ab4af63ee8cd3146ef370273dcf2ab66f715c42eb29a815f51c721d1a2b"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.304995 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hnsh7" event={"ID":"e6bdcf19-db76-497c-a2fe-a6de38fae724","Type":"ContainerStarted","Data":"70e987324485f04a528051e1c4554d8c5806c907f67af5218c0970ab13cf9e3b"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.310935 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" 
event={"ID":"727302ed-b5c0-49b7-be17-7da9387c16c3","Type":"ContainerStarted","Data":"37d1d479ad57c1095d857b9bb52d51e059395ee6131cd8e758475e415c3fe86e"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.321105 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" event={"ID":"5f6e45f7-93da-46b8-9021-d2500076c385","Type":"ContainerStarted","Data":"ac7df773855e739622ed1fc070613c600351a5438de68a58ec6d7302b25a923f"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.321644 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.324198 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" event={"ID":"796e9631-3388-48b1-8675-3fbc4b6e435d","Type":"ContainerStarted","Data":"4e6b4035a7e79b8d64117aea9e3cf6e2de88e935585f4e47f14b7523b0476a41"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.324434 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" event={"ID":"796e9631-3388-48b1-8675-3fbc4b6e435d","Type":"ContainerStarted","Data":"d46528a0707304b437a71f9c1955cd18d8b2071a99d85c6b9da1246d823f4c34"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.326174 4699 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-czs8l container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.326237 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" podUID="5f6e45f7-93da-46b8-9021-d2500076c385" 
containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 26 11:14:01 crc kubenswrapper[4699]: W0226 11:14:01.328509 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab52d01_f907_44cb_8d5f_162116d75fc9.slice/crio-7e76c7fe195b5c4a77f4cf9d0dcbabd8f147496fccdf7af35990ca526dc334e9 WatchSource:0}: Error finding container 7e76c7fe195b5c4a77f4cf9d0dcbabd8f147496fccdf7af35990ca526dc334e9: Status 404 returned error can't find the container with id 7e76c7fe195b5c4a77f4cf9d0dcbabd8f147496fccdf7af35990ca526dc334e9 Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.331811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" event={"ID":"744aa737-e6c7-4d6b-ba7d-a9479043ad29","Type":"ContainerStarted","Data":"3f258b9ae41f11af5114ab5232e03c4aa9dff40c08fe1e6fde31d40c3ec891ec"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.333933 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.334194 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.834150421 +0000 UTC m=+187.644976865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: W0226 11:14:01.334257 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ecd5cc_b456_4d69_897c_5fd543842440.slice/crio-9488713a40a5a4743e81c00be9ca4d60ac9ec1a1f8456efac6adb5d8c076245a WatchSource:0}: Error finding container 9488713a40a5a4743e81c00be9ca4d60ac9ec1a1f8456efac6adb5d8c076245a: Status 404 returned error can't find the container with id 9488713a40a5a4743e81c00be9ca4d60ac9ec1a1f8456efac6adb5d8c076245a Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.334404 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.335139 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.835091239 +0000 UTC m=+187.645917673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: W0226 11:14:01.360760 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09191eec_0be2_4c45_9249_6c8081d6108a.slice/crio-e8aa800314c45bdd20aa1e8f81dcaae8818e4823d3d14052fbf7e64cff8fc222 WatchSource:0}: Error finding container e8aa800314c45bdd20aa1e8f81dcaae8818e4823d3d14052fbf7e64cff8fc222: Status 404 returned error can't find the container with id e8aa800314c45bdd20aa1e8f81dcaae8818e4823d3d14052fbf7e64cff8fc222 Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.367060 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f8rd9" podStartSLOduration=135.367033882 podStartE2EDuration="2m15.367033882s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.359654774 +0000 UTC m=+187.170481208" watchObservedRunningTime="2026-02-26 11:14:01.367033882 +0000 UTC m=+187.177860326" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.377726 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" event={"ID":"64bd7009-a06a-43e1-b265-3ea78b5801b9","Type":"ContainerStarted","Data":"546f4e9939bdcb6b6a69bb68130d2cc00c06f5fb76617988db3d5893ca3c3033"} Feb 26 11:14:01 crc 
kubenswrapper[4699]: I0226 11:14:01.377788 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" event={"ID":"64bd7009-a06a-43e1-b265-3ea78b5801b9","Type":"ContainerStarted","Data":"461c75688cba99d404711ec29bfef434c23a4cfc300f855eaa6bacc80a839a48"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.379771 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xm88w" event={"ID":"4a97e310-1811-48a9-a31a-eb9a0321d280","Type":"ContainerStarted","Data":"5b91a295abc1a424ee2421a049bdfa52ce074ce87f531c64227c3c3c8d36a6a3"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.381444 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" event={"ID":"af5429d7-39d0-4b17-8219-21c8491384ae","Type":"ContainerStarted","Data":"0e4ca5729966bb0667537e4367e68fd53633c240e0f3a0334bb9d0da3844c7d8"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.383569 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" event={"ID":"bad776f4-e24b-41f1-88d8-2b1fe6258783","Type":"ContainerStarted","Data":"fa85da4b93436fb19c42dc8f6a401ea56d65809f1ba71d6912b236113b2c8e20"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.386949 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" event={"ID":"5cc10041-704b-4b00-8e4e-369103434b64","Type":"ContainerStarted","Data":"be07ebbed72d10e6a52397198b9b567e946941b2a2ee6b1a35e4358ea9958b9f"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.392729 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" 
event={"ID":"89928475-c3fb-415f-a244-6292dc8adc33","Type":"ContainerStarted","Data":"eb58766da6db5ad222a77172d145c88eee88ef68556793b7b04b904329008f36"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.410805 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" event={"ID":"8cd7cbed-d0bf-4d8c-933c-4d031170288a","Type":"ContainerStarted","Data":"59dfddc3a12d600de8024a8dc9e456537fbd7f53513a49c25752922952cca5ef"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.439682 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.440091 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:01.940045248 +0000 UTC m=+187.750871682 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.444888 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" podStartSLOduration=134.44487007 podStartE2EDuration="2m14.44487007s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.400205422 +0000 UTC m=+187.211031856" watchObservedRunningTime="2026-02-26 11:14:01.44487007 +0000 UTC m=+187.255696524" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.454232 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" event={"ID":"61a1581f-5367-4535-99bc-3f28547ab766","Type":"ContainerStarted","Data":"74cac48911929390e354aa36a7f066350a1dc8abd1a812803d29897753864181"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.459975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" event={"ID":"44832f39-2c56-4669-b328-7e663f6cacdf","Type":"ContainerStarted","Data":"e5ba7421b7ee0eabc1adc04968b758675d888f789421bf4eac320c3bfbc2a860"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.460819 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.461856 4699 
patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w7nqx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.461886 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" podUID="44832f39-2c56-4669-b328-7e663f6cacdf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.464666 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" event={"ID":"5f8a28b8-c47b-4288-877f-8e90a3b581b5","Type":"ContainerStarted","Data":"61a2c48ee6bf74ea4766fbbb38a98752e4fc1a270493117d88d14b6af7b2c988"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.483877 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" podStartSLOduration=134.483862111 podStartE2EDuration="2m14.483862111s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.481900764 +0000 UTC m=+187.292727218" watchObservedRunningTime="2026-02-26 11:14:01.483862111 +0000 UTC m=+187.294688545" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.489030 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnwpn" event={"ID":"89f840f7-d21f-4028-b53d-ed0e2061ff15","Type":"ContainerStarted","Data":"0e40d9202d784aea686c0e49c7ec729c734c46a5b71cd5e4083460e2aac9610a"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 
11:14:01.500734 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" event={"ID":"79a9064f-5fcf-42f7-af6f-71aeeb75560e","Type":"ContainerStarted","Data":"b0dae2453795efff0471c82ab81da1da672c09bf5a5f39865ea967629840dfc5"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.508882 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" event={"ID":"679ffaa0-41b8-4638-8b4c-4c1f424812e4","Type":"ContainerStarted","Data":"0ad7548462393f609cec4e6348b963e054d02e1bfee6adfff9394544e4d429bf"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.518423 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" event={"ID":"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1","Type":"ContainerStarted","Data":"3f775704b88aa7370619139cbc98bee3d682cdf286fe1a2c9be29db696c38016"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.524042 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" event={"ID":"23bae79f-03c7-4710-ac97-25da2c7988c4","Type":"ContainerStarted","Data":"354a6d0c0425c100273b0e3aa2ab50be9bee531b1d3bb56c3f721024b7d92514"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.526883 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" podStartSLOduration=135.526865361 podStartE2EDuration="2m15.526865361s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.524252484 +0000 UTC m=+187.335078918" watchObservedRunningTime="2026-02-26 11:14:01.526865361 +0000 UTC m=+187.337691805" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.526920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" event={"ID":"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a","Type":"ContainerStarted","Data":"2596306f1087bf01b8a58fd8b0bb65d12065ae02e0851934fb89d1efcdbc1abe"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.528231 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" event={"ID":"b4f243e8-e08c-420e-a78b-02e6a14bf5fe","Type":"ContainerStarted","Data":"106fcb2e12af20c2b9fe2679e1afb633dc9be2ac23683a5f742b1ed9156b7029"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.529404 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" event={"ID":"6956c039-cf77-429b-8f7f-f93ba195d321","Type":"ContainerStarted","Data":"2d2c52d524532517fe59c7ad81e75737aa073b7328d6c10323d7e6a7fd831621"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.531393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" event={"ID":"03fd3407-9529-4638-89d6-cfc6b703e510","Type":"ContainerStarted","Data":"c353d9bed35789a32b2ea23a2f94f9c2e40f463057a2ff1c95e594e8b545182a"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.534332 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" event={"ID":"36efccb8-7513-43d0-8952-d7ad9546da8e","Type":"ContainerStarted","Data":"3d61ef78d0d188a8104bb6b9e11972fc2e015d1356eaec6fa7c27c6982fbcadf"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.540987 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.542514 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" event={"ID":"1d3e449f-d082-43cb-951d-53d82fde40ca","Type":"ContainerStarted","Data":"f1623375653186ceeeee97072b6c40764d593d9979efcd32aef8063f9c0bbe28"} Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.542774 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.04275954 +0000 UTC m=+187.853585974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.545442 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" event={"ID":"fbc38281-b1a4-4c40-a707-a106b651c107","Type":"ContainerStarted","Data":"a3681115709c674e5549240fab4b9c4a64d0d58866d20fa14451fcd36adf8436"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.546701 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" event={"ID":"0aac34b6-aad8-4b68-8180-f68af008611d","Type":"ContainerStarted","Data":"aff982394be5aa630df779a0b1c809a74f2b8b80c8796a2d43b3d6187d5bfafc"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.548217 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" event={"ID":"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466","Type":"ContainerStarted","Data":"9d7ac90385fbaeacd88791e44cd5f3dbc802f7727daac69d69660d2d1079d013"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.550650 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" event={"ID":"5d015dd8-56c9-4f61-b133-4951cda91ca5","Type":"ContainerStarted","Data":"ebd3f92a38859a47f362253252572ab4dfa30e38ca7be5dde9c7b9c8d20415d3"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.556331 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tcnxt" event={"ID":"72b1bc55-f48b-4d90-ab02-3a80438096b6","Type":"ContainerStarted","Data":"06f7357a2582fe2f5b7f2bf23417d511d550f2b8ba54d68e0488f4ecdfb3c7b1"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.559512 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zqgj9" podStartSLOduration=134.559499275 podStartE2EDuration="2m14.559499275s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.558593698 +0000 UTC m=+187.369420132" watchObservedRunningTime="2026-02-26 11:14:01.559499275 +0000 UTC m=+187.370325709" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.574448 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" event={"ID":"460579d9-ed16-49b7-a588-ef20ceb9bbf4","Type":"ContainerStarted","Data":"1c18b04fd38d4e5cfb82fc15f10a9055f343d1ab9d91ad12c3161750e4de76b7"} Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.604873 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tg675" podStartSLOduration=135.604849724 podStartE2EDuration="2m15.604849724s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.603468863 +0000 UTC m=+187.414295307" watchObservedRunningTime="2026-02-26 11:14:01.604849724 +0000 UTC m=+187.415676158" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.627854 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535074-bjfld"] Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.644998 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.646457 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.146411851 +0000 UTC m=+187.957238315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.648888 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpntx" podStartSLOduration=135.648867943 podStartE2EDuration="2m15.648867943s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.648191943 +0000 UTC m=+187.459018377" watchObservedRunningTime="2026-02-26 11:14:01.648867943 +0000 UTC m=+187.459694377" Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.746570 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.747029 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.247006841 +0000 UTC m=+188.057833305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.806661 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.848421 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.848595 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.348573149 +0000 UTC m=+188.159399583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.848980 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.849730 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.349505966 +0000 UTC m=+188.160332400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.950614 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.950777 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.450752255 +0000 UTC m=+188.261578689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:01 crc kubenswrapper[4699]: I0226 11:14:01.951272 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:01 crc kubenswrapper[4699]: E0226 11:14:01.951697 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.451685723 +0000 UTC m=+188.262512227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.053284 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.053773 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.553738506 +0000 UTC m=+188.364564940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.054248 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.054629 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.554611992 +0000 UTC m=+188.365438426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.157762 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.158109 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.658063086 +0000 UTC m=+188.468889650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.158434 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.159040 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.659031155 +0000 UTC m=+188.469857589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.261953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.262402 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.762380956 +0000 UTC m=+188.573207390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.363217 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.364615 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.86446106 +0000 UTC m=+188.675287494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.467749 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.468246 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:02.968229194 +0000 UTC m=+188.779055628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.569028 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.569616 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.069593467 +0000 UTC m=+188.880419971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.587165 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hnsh7" event={"ID":"e6bdcf19-db76-497c-a2fe-a6de38fae724","Type":"ContainerStarted","Data":"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.588709 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" event={"ID":"727302ed-b5c0-49b7-be17-7da9387c16c3","Type":"ContainerStarted","Data":"fe9d1cdec02a8cee8cf95b26e05b856087c021ba99aa27e5af51dcfe0240cf0f"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.590662 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" event={"ID":"5d015dd8-56c9-4f61-b133-4951cda91ca5","Type":"ContainerStarted","Data":"4d27114bb717e86cff571e28c7aba751067298ba670dfaa69e6d2ee075e6f067"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.593381 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" event={"ID":"5cc10041-704b-4b00-8e4e-369103434b64","Type":"ContainerStarted","Data":"2d5a0c0e5922846f3c8cbf86200e16bf7b7b0416026b9755460756c2a821cc04"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.595020 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535074-bjfld" 
event={"ID":"30d444da-9127-459c-97c6-cdcff5b20e67","Type":"ContainerStarted","Data":"18076c1c5e0cfb7ca48ea66321abbc8359663b222708aa29c8481673d9c4ff5c"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.596448 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnwpn" event={"ID":"89f840f7-d21f-4028-b53d-ed0e2061ff15","Type":"ContainerStarted","Data":"2318dbfdb102c963d608346d7789eb7d0ad448ba89834a14c0ed9536fff0d574"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.597758 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" event={"ID":"0c7d5fe0-885a-44e4-bacf-19bceeea178f","Type":"ContainerStarted","Data":"8e91007fd24b6b235fe00b711c6276d26c8ecc825c1024e041f2ce0b0f9007bd"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.600296 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" event={"ID":"34163385-0c26-4d54-a06a-11f9ef09901d","Type":"ContainerStarted","Data":"b82c4200e29693534baf39c659f06cfff7e11c6e83f1aa3114cbdca96a978e17"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.602288 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" event={"ID":"744aa737-e6c7-4d6b-ba7d-a9479043ad29","Type":"ContainerStarted","Data":"e82195f6e02bee889f35431d957fa456c0b1db807d039ddcd99b290da7bc9288"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.604618 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rlx7c" event={"ID":"6b9ab605-cf5d-43ea-9554-20032a52e23c","Type":"ContainerStarted","Data":"37d92097cdf24750a95f125a3a12ba947a11479227a9d589f2bff87c62cf1dea"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.605859 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ctfcc" podStartSLOduration=135.605843457 podStartE2EDuration="2m15.605843457s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:01.689170353 +0000 UTC m=+187.499996797" watchObservedRunningTime="2026-02-26 11:14:02.605843457 +0000 UTC m=+188.416669891" Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.606105 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvgmh" podStartSLOduration=136.606099195 podStartE2EDuration="2m16.606099195s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.605151787 +0000 UTC m=+188.415978221" watchObservedRunningTime="2026-02-26 11:14:02.606099195 +0000 UTC m=+188.416925639" Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.613134 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" event={"ID":"af5429d7-39d0-4b17-8219-21c8491384ae","Type":"ContainerStarted","Data":"eb6b24300a1f0473a2db7791c96c541a2a5af8105921dfceb5d170fda26524a1"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.614384 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" event={"ID":"fab52d01-f907-44cb-8d5f-162116d75fc9","Type":"ContainerStarted","Data":"7e76c7fe195b5c4a77f4cf9d0dcbabd8f147496fccdf7af35990ca526dc334e9"} Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.621528 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rlx7c" podStartSLOduration=7.621505749 
podStartE2EDuration="7.621505749s" podCreationTimestamp="2026-02-26 11:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.620005705 +0000 UTC m=+188.430832159" watchObservedRunningTime="2026-02-26 11:14:02.621505749 +0000 UTC m=+188.432332183"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.624341 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" event={"ID":"03fd3407-9529-4638-89d6-cfc6b703e510","Type":"ContainerStarted","Data":"ec8bc8192ce6082446d639b19d1c7574145a66a52226a83d86f89ce7579a3a4e"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.630081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" event={"ID":"61a1581f-5367-4535-99bc-3f28547ab766","Type":"ContainerStarted","Data":"3de5a5b46da3ed916b21ce510b57be99f1b3297b846ab542ef9a25da569ef185"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.632843 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2phw" event={"ID":"00fcad37-801c-4a2c-8599-dabd0f36db6d","Type":"ContainerStarted","Data":"29099d235a86ea4029c6ad27a31f4f0f3f48126cf629096571b61afd018fa9a5"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.634890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" event={"ID":"e0ecd5cc-b456-4d69-897c-5fd543842440","Type":"ContainerStarted","Data":"9488713a40a5a4743e81c00be9ca4d60ac9ec1a1f8456efac6adb5d8c076245a"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.637070 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" event={"ID":"1a9875bc-9f2e-4887-8dc6-a00cc789eb4a","Type":"ContainerStarted","Data":"2ec7241379b9f3b05752fa70dd15ed6b5a7df760f3526f5163051cfc14d835f1"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.637646 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.638455 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" event={"ID":"09191eec-0be2-4c45-9249-6c8081d6108a","Type":"ContainerStarted","Data":"e8aa800314c45bdd20aa1e8f81dcaae8818e4823d3d14052fbf7e64cff8fc222"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.639160 4699 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvgnb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.639202 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" podUID="1a9875bc-9f2e-4887-8dc6-a00cc789eb4a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.640044 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" event={"ID":"bad776f4-e24b-41f1-88d8-2b1fe6258783","Type":"ContainerStarted","Data":"f6ef511605018ef6334a323102f99d31a070e7c94cc362d42542c1f9238cf81b"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.644784 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" event={"ID":"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466","Type":"ContainerStarted","Data":"74570cc7e5f47cfb5ae78c7040168924d22c48d5892dd1918f787cd4639c996c"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.647086 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" event={"ID":"89928475-c3fb-415f-a244-6292dc8adc33","Type":"ContainerStarted","Data":"f8083a65bee010aee31678eea79efabcd118304cb7619ff3f9d81c06b0d356ae"}
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.648066 4699 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w7nqx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body=
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.648132 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" podUID="44832f39-2c56-4669-b328-7e663f6cacdf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.648194 4699 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-czs8l container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.648215 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" podUID="5f6e45f7-93da-46b8-9021-d2500076c385" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.648451 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.650324 4699 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gsl8w container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.650386 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.650844 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r2phw" podStartSLOduration=7.650829735 podStartE2EDuration="7.650829735s" podCreationTimestamp="2026-02-26 11:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.650520836 +0000 UTC m=+188.461347290" watchObservedRunningTime="2026-02-26 11:14:02.650829735 +0000 UTC m=+188.461656169"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.672960 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.673381 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.17333824 +0000 UTC m=+188.984164834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.694345 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" podStartSLOduration=136.694320819 podStartE2EDuration="2m16.694320819s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.693605778 +0000 UTC m=+188.504432232" watchObservedRunningTime="2026-02-26 11:14:02.694320819 +0000 UTC m=+188.505147253"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.714910 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2zshh" podStartSLOduration=136.714876726 podStartE2EDuration="2m16.714876726s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.713915588 +0000 UTC m=+188.524742042" watchObservedRunningTime="2026-02-26 11:14:02.714876726 +0000 UTC m=+188.525703160"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.733012 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" podStartSLOduration=135.732984641 podStartE2EDuration="2m15.732984641s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.731789526 +0000 UTC m=+188.542615960" watchObservedRunningTime="2026-02-26 11:14:02.732984641 +0000 UTC m=+188.543811075"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.751794 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qsj62" podStartSLOduration=136.751766245 podStartE2EDuration="2m16.751766245s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.750095386 +0000 UTC m=+188.560921830" watchObservedRunningTime="2026-02-26 11:14:02.751766245 +0000 UTC m=+188.562592679"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.767835 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p9wj4" podStartSLOduration=136.767809429 podStartE2EDuration="2m16.767809429s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:02.76717853 +0000 UTC m=+188.578004974" watchObservedRunningTime="2026-02-26 11:14:02.767809429 +0000 UTC m=+188.578635863"
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.775513 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.776867 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.276847366 +0000 UTC m=+189.087674000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.877463 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.877620 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.377601471 +0000 UTC m=+189.188427895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.878072 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.878375 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.378366683 +0000 UTC m=+189.189193117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.979153 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.979398 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.479366135 +0000 UTC m=+189.290192579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:02 crc kubenswrapper[4699]: I0226 11:14:02.979514 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:02 crc kubenswrapper[4699]: E0226 11:14:02.979932 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.479921592 +0000 UTC m=+189.290748106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.080211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.080699 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.580679507 +0000 UTC m=+189.391505941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.186959 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.187367 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.687351616 +0000 UTC m=+189.498178040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.288415 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.288898 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.788878794 +0000 UTC m=+189.599705228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.390615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.391143 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.891122972 +0000 UTC m=+189.701949406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.492184 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.492611 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:03.992590368 +0000 UTC m=+189.803416812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.594724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.595951 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.095931279 +0000 UTC m=+189.906757713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.679708 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" event={"ID":"b4f243e8-e08c-420e-a78b-02e6a14bf5fe","Type":"ContainerStarted","Data":"418417a4b5a05608ddf0c9f7b70ff9d2d23d879dcec0660a1e84735a58ee62da"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.687013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" event={"ID":"6956c039-cf77-429b-8f7f-f93ba195d321","Type":"ContainerStarted","Data":"2e342cce17684d6ff691ec10d2bcc3942a85d6162a78085ad90681a3a3df3576"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.690617 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" event={"ID":"fab52d01-f907-44cb-8d5f-162116d75fc9","Type":"ContainerStarted","Data":"d0a5ae12fe179876187bd5fcc07db538307a9088a6b77eef993ddb984feb4ce6"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.694841 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" event={"ID":"679ffaa0-41b8-4638-8b4c-4c1f424812e4","Type":"ContainerStarted","Data":"a2fef55cbc0542eb24393c676c5f5bd5d0c3e9eac64318f59af7579cb2cbaccb"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.702204 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.702658 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.20263715 +0000 UTC m=+190.013463584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.730026 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" event={"ID":"5d015dd8-56c9-4f61-b133-4951cda91ca5","Type":"ContainerStarted","Data":"e1b4b4117e357c9cec9f98a9ad6f893e6302ee88ae56e4d7312bba96bb4ecbc3"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.738283 4699 generic.go:334] "Generic (PLEG): container finished" podID="a550b2ea-3ce7-4df3-bbf5-f1025afca8c1" containerID="86924254b70133e8a088814a88e325681fd85745ba692e775d8e52888481afc0" exitCode=0
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.738406 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" event={"ID":"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1","Type":"ContainerDied","Data":"86924254b70133e8a088814a88e325681fd85745ba692e775d8e52888481afc0"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.746913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" event={"ID":"1d3e449f-d082-43cb-951d-53d82fde40ca","Type":"ContainerStarted","Data":"f9141d6a8a5238a8078b83999f7ed2b68f3d889e35ec03a155b62c335ec78209"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.747934 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.763807 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tcnxt" event={"ID":"72b1bc55-f48b-4d90-ab02-3a80438096b6","Type":"ContainerStarted","Data":"a48cf703fa085cd2031065303843172d7d70091d4cf97de0a11e40328102d59a"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.768216 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-j6vfb" podStartSLOduration=137.768192835 podStartE2EDuration="2m17.768192835s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:03.720034914 +0000 UTC m=+189.530861358" watchObservedRunningTime="2026-02-26 11:14:03.768192835 +0000 UTC m=+189.579019259"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.777322 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tcnxt"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.780139 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.780225 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.785146 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" event={"ID":"af5429d7-39d0-4b17-8219-21c8491384ae","Type":"ContainerStarted","Data":"2f9adae57e6d2b1af184657dc07068c4c1b867837d37037e20a68bce7fd5be7e"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.788182 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" event={"ID":"e0ecd5cc-b456-4d69-897c-5fd543842440","Type":"ContainerStarted","Data":"82c13a5022914e952a2bab53f2e05496950cc43642fe3a2c7f25c55df6a55a09"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.791684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnwpn" event={"ID":"89f840f7-d21f-4028-b53d-ed0e2061ff15","Type":"ContainerStarted","Data":"347daff707932d34a7509e2817d19f9589aa550ee3b39218f581167ad5b4425f"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.791868 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tnwpn"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.797442 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" event={"ID":"0c7d5fe0-885a-44e4-bacf-19bceeea178f","Type":"ContainerStarted","Data":"4b9ba0a4dd4a72e7950717c813bae7e51efbff5926a279fa5d2cda039b11d068"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.798523 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hzqgp"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.802218 4699 patch_prober.go:28] interesting pod/console-operator-58897d9998-hzqgp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.802351 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" podUID="0c7d5fe0-885a-44e4-bacf-19bceeea178f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.816203 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.833008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" event={"ID":"61a1581f-5367-4535-99bc-3f28547ab766","Type":"ContainerStarted","Data":"bd8dd636cc1be15cadc79b3f065ace3f146caed71d3576259cd5542c9b4db330"}
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.838669 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.338602904 +0000 UTC m=+190.149429348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.842835 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" event={"ID":"8cd7cbed-d0bf-4d8c-933c-4d031170288a","Type":"ContainerStarted","Data":"5fb3c642cc2b4db8ee02516e04dda3e41ccb434f432279e5c63d1f26897348af"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.859668 4699 generic.go:334] "Generic (PLEG): container finished" podID="afa5e1ce-a457-4771-ab06-2654a7801704" containerID="1594adc44c33ac8ba0a68282a8063be46f209d5dc350a05f6bb0643ca257702a" exitCode=0
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.859746 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" event={"ID":"afa5e1ce-a457-4771-ab06-2654a7801704","Type":"ContainerDied","Data":"1594adc44c33ac8ba0a68282a8063be46f209d5dc350a05f6bb0643ca257702a"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.865430 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw64v" podStartSLOduration=137.865407426 podStartE2EDuration="2m17.865407426s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:03.775998456 +0000 UTC m=+189.586824890" watchObservedRunningTime="2026-02-26 11:14:03.865407426 +0000 UTC m=+189.676233870"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.908159 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" event={"ID":"460579d9-ed16-49b7-a588-ef20ceb9bbf4","Type":"ContainerStarted","Data":"412810a4b96fc8e6fac38daceab60328782f3b3751aa9492bb91d85380f550fc"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.909529 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" podStartSLOduration=136.909472537 podStartE2EDuration="2m16.909472537s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:03.864103057 +0000 UTC m=+189.674929491" watchObservedRunningTime="2026-02-26 11:14:03.909472537 +0000 UTC m=+189.720299001"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.914227 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tcnxt" podStartSLOduration=137.914193766 podStartE2EDuration="2m17.914193766s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:03.910710733 +0000 UTC m=+189.721537167" watchObservedRunningTime="2026-02-26 11:14:03.914193766 +0000 UTC m=+189.725020210"
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.921898 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:03 crc kubenswrapper[4699]: E0226 11:14:03.922940 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.422917434 +0000 UTC m=+190.233743858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.950045 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" event={"ID":"09191eec-0be2-4c45-9249-6c8081d6108a","Type":"ContainerStarted","Data":"4ec8095ed163cca6baee580c32afe95f34593492d8e83cb223fc13e4f1513d5e"}
Feb 26 11:14:03 crc kubenswrapper[4699]: I0226 11:14:03.991295 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xm88w" event={"ID":"4a97e310-1811-48a9-a31a-eb9a0321d280","Type":"ContainerStarted","Data":"7d90a4f67d2dd0c0cf5d5064360d65b4fc14ff112f5931ab2169c9464452b8c9"}
Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.026065 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.036729 4699 nestedpendingoperations.go:348]
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.536706464 +0000 UTC m=+190.347532898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.054752 4699 generic.go:334] "Generic (PLEG): container finished" podID="03fd3407-9529-4638-89d6-cfc6b703e510" containerID="ec8bc8192ce6082446d639b19d1c7574145a66a52226a83d86f89ce7579a3a4e" exitCode=0 Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.058472 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" event={"ID":"03fd3407-9529-4638-89d6-cfc6b703e510","Type":"ContainerDied","Data":"ec8bc8192ce6082446d639b19d1c7574145a66a52226a83d86f89ce7579a3a4e"} Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060144 4699 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvgnb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060225 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" podUID="1a9875bc-9f2e-4887-8dc6-a00cc789eb4a" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060740 4699 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w7nqx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060787 4699 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gsl8w container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060814 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060820 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" podUID="44832f39-2c56-4669-b328-7e663f6cacdf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.060910 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.061198 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 
26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.061235 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.063053 4699 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cd5qf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.063093 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.088274 4699 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fq7g8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.088327 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.088407 4699 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-22qbz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.088422 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.129309 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.130677 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.630661538 +0000 UTC m=+190.441487972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.176813 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gngtb" podStartSLOduration=138.17679548 podStartE2EDuration="2m18.17679548s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:03.973482847 +0000 UTC m=+189.784309281" watchObservedRunningTime="2026-02-26 11:14:04.17679548 +0000 UTC m=+189.987621914" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.177558 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tm8w" podStartSLOduration=138.177553652 podStartE2EDuration="2m18.177553652s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.176042547 +0000 UTC m=+189.986868981" watchObservedRunningTime="2026-02-26 11:14:04.177553652 +0000 UTC m=+189.988380086" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.235168 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.235512 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.735500083 +0000 UTC m=+190.546326517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.331287 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tnwpn" podStartSLOduration=9.33126705 podStartE2EDuration="9.33126705s" podCreationTimestamp="2026-02-26 11:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.231902677 +0000 UTC m=+190.042729121" watchObservedRunningTime="2026-02-26 11:14:04.33126705 +0000 UTC m=+190.142093484" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.336527 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" podStartSLOduration=138.336504125 podStartE2EDuration="2m18.336504125s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
11:14:04.333832956 +0000 UTC m=+190.144659380" watchObservedRunningTime="2026-02-26 11:14:04.336504125 +0000 UTC m=+190.147330569" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.337271 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.337432 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.837411182 +0000 UTC m=+190.648237616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.337500 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.337902 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:04.837892886 +0000 UTC m=+190.648719320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.424025 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vmxjr" podStartSLOduration=138.424002508 podStartE2EDuration="2m18.424002508s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.421258967 +0000 UTC m=+190.232085421" watchObservedRunningTime="2026-02-26 11:14:04.424002508 +0000 UTC m=+190.234828952" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.438943 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.439352 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 11:14:04.939333941 +0000 UTC m=+190.750160375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.521452 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k6wtb" podStartSLOduration=138.521424985 podStartE2EDuration="2m18.521424985s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.520976842 +0000 UTC m=+190.331803276" watchObservedRunningTime="2026-02-26 11:14:04.521424985 +0000 UTC m=+190.332251419" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.541442 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.541909 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.041891719 +0000 UTC m=+190.852718153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.574444 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xbpcs" podStartSLOduration=138.57442396 podStartE2EDuration="2m18.57442396s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.572186013 +0000 UTC m=+190.383012457" watchObservedRunningTime="2026-02-26 11:14:04.57442396 +0000 UTC m=+190.385250384" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.622496 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hnsh7" podStartSLOduration=138.622476128 podStartE2EDuration="2m18.622476128s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.619062697 +0000 UTC m=+190.429889151" watchObservedRunningTime="2026-02-26 11:14:04.622476128 +0000 UTC m=+190.433302562" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.645211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.645578 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.14556308 +0000 UTC m=+190.956389504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.647866 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" podStartSLOduration=138.647853657 podStartE2EDuration="2m18.647853657s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.645835568 +0000 UTC m=+190.456662002" watchObservedRunningTime="2026-02-26 11:14:04.647853657 +0000 UTC m=+190.458680092" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.724472 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.726575 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" 
start-of-body= Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.726622 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.747221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.747629 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.247614773 +0000 UTC m=+191.058441207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.842949 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xm88w" podStartSLOduration=138.842927147 podStartE2EDuration="2m18.842927147s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.77663874 +0000 UTC m=+190.587465184" watchObservedRunningTime="2026-02-26 11:14:04.842927147 +0000 UTC m=+190.653753581" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.844439 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" podStartSLOduration=137.844432531 podStartE2EDuration="2m17.844432531s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.842307279 +0000 UTC m=+190.653133733" watchObservedRunningTime="2026-02-26 11:14:04.844432531 +0000 UTC m=+190.655258965" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.848995 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.849491 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.34947232 +0000 UTC m=+191.160298754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.884596 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" podStartSLOduration=138.884575347 podStartE2EDuration="2m18.884575347s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.87723969 +0000 UTC m=+190.688066134" watchObservedRunningTime="2026-02-26 11:14:04.884575347 +0000 UTC m=+190.695401781" Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.954364 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:04 crc kubenswrapper[4699]: E0226 11:14:04.954745 
4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.454729358 +0000 UTC m=+191.265555792 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:04 crc kubenswrapper[4699]: I0226 11:14:04.963571 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-k9bv4" podStartSLOduration=138.963549248 podStartE2EDuration="2m18.963549248s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.921736984 +0000 UTC m=+190.732563418" watchObservedRunningTime="2026-02-26 11:14:04.963549248 +0000 UTC m=+190.774375692" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.055842 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.056174 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.556151813 +0000 UTC m=+191.366978247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.159289 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.159350 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" event={"ID":"09191eec-0be2-4c45-9249-6c8081d6108a","Type":"ContainerStarted","Data":"b4e29a36046b28c5156e1d2d7970fd894db79a0c6417201cc321957230906e44"} Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.159744 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.659729001 +0000 UTC m=+191.470555435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.194665 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" event={"ID":"03fd3407-9529-4638-89d6-cfc6b703e510","Type":"ContainerStarted","Data":"3fa0d5be17dd80720d17b8f823f5ed502ea5130df1ef1d9a2520602f0d57c64d"} Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.195302 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.201219 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" event={"ID":"679ffaa0-41b8-4638-8b4c-4c1f424812e4","Type":"ContainerStarted","Data":"de6932c7dc258437fbe25d836eeab2dd76c193476998fff02f811b5f0b5fde19"} Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.214147 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" event={"ID":"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1","Type":"ContainerStarted","Data":"dc90312239394159bc93a17b5e295cfe48f5d30d81ba0217af32834a3153cb80"} Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.228999 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gpjvh" podStartSLOduration=139.228976885 podStartE2EDuration="2m19.228976885s" 
podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:04.983722074 +0000 UTC m=+190.794548528" watchObservedRunningTime="2026-02-26 11:14:05.228976885 +0000 UTC m=+191.039803339" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.230334 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wcnnr" podStartSLOduration=139.230326825 podStartE2EDuration="2m19.230326825s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.228557053 +0000 UTC m=+191.039383497" watchObservedRunningTime="2026-02-26 11:14:05.230326825 +0000 UTC m=+191.041153259" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.265923 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.266206 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.766188304 +0000 UTC m=+191.577014738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.266548 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.270347 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.770330366 +0000 UTC m=+191.581156800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.283195 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" event={"ID":"b4f243e8-e08c-420e-a78b-02e6a14bf5fe","Type":"ContainerStarted","Data":"569bd3906af910300f4d7496d6bc8901382152ed8d3a57c679d317d53c830bef"} Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.287319 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p742p" podStartSLOduration=139.287300027 podStartE2EDuration="2m19.287300027s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.26977783 +0000 UTC m=+191.080604264" watchObservedRunningTime="2026-02-26 11:14:05.287300027 +0000 UTC m=+191.098126461" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.340716 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v5ctv" event={"ID":"6956c039-cf77-429b-8f7f-f93ba195d321","Type":"ContainerStarted","Data":"f7d17fd3b95c04cd27aba31e4f058d08f123e7867d573100242c1cdf0b285359"} Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.342007 4699 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fq7g8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.342059 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.343694 4699 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cd5qf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.343733 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.344279 4699 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-22qbz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.344307 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: 
connect: connection refused" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.345964 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.346020 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.349585 4699 patch_prober.go:28] interesting pod/console-operator-58897d9998-hzqgp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.349628 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" podUID="0c7d5fe0-885a-44e4-bacf-19bceeea178f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.369578 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.370799 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.870775271 +0000 UTC m=+191.681601715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.445233 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" podStartSLOduration=139.445208828 podStartE2EDuration="2m19.445208828s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.443633342 +0000 UTC m=+191.254459776" watchObservedRunningTime="2026-02-26 11:14:05.445208828 +0000 UTC m=+191.256035262" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.484046 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.493079 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:05.993057331 +0000 UTC m=+191.803883835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.494034 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v5ctv" podStartSLOduration=139.494013909 podStartE2EDuration="2m19.494013909s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.493549916 +0000 UTC m=+191.304376360" watchObservedRunningTime="2026-02-26 11:14:05.494013909 +0000 UTC m=+191.304840353" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.576227 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pxsr8" podStartSLOduration=139.576207046 podStartE2EDuration="2m19.576207046s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.542615504 +0000 UTC m=+191.353441948" watchObservedRunningTime="2026-02-26 11:14:05.576207046 +0000 UTC m=+191.387033480" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.598550 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.608261 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.107928473 +0000 UTC m=+191.918754907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.705897 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.706312 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.206299617 +0000 UTC m=+192.017126051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.729329 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:05 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:05 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:05 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.729400 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.807716 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.808252 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 11:14:06.308233557 +0000 UTC m=+192.119059991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:05 crc kubenswrapper[4699]: I0226 11:14:05.909836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:05 crc kubenswrapper[4699]: E0226 11:14:05.910375 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.410356662 +0000 UTC m=+192.221183096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.011506 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.011714 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.511682353 +0000 UTC m=+192.322508797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.011884 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.012315 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.512300582 +0000 UTC m=+192.323127016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.113155 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.113322 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.613304534 +0000 UTC m=+192.424130968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.113470 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.113834 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.613824509 +0000 UTC m=+192.424650943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.214051 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.214500 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.714483541 +0000 UTC m=+192.525309975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.320463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.320841 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.820826421 +0000 UTC m=+192.631652855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.321871 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" podStartSLOduration=139.321838091 podStartE2EDuration="2m19.321838091s" podCreationTimestamp="2026-02-26 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:05.579036559 +0000 UTC m=+191.389862993" watchObservedRunningTime="2026-02-26 11:14:06.321838091 +0000 UTC m=+192.132664525"
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.401074 4699 generic.go:334] "Generic (PLEG): container finished" podID="5f8a28b8-c47b-4288-877f-8e90a3b581b5" containerID="61a2c48ee6bf74ea4766fbbb38a98752e4fc1a270493117d88d14b6af7b2c988" exitCode=0
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.401154 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" event={"ID":"5f8a28b8-c47b-4288-877f-8e90a3b581b5","Type":"ContainerDied","Data":"61a2c48ee6bf74ea4766fbbb38a98752e4fc1a270493117d88d14b6af7b2c988"}
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.421976 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.422407 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:06.92238958 +0000 UTC m=+192.733216014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.428097 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" event={"ID":"79a9064f-5fcf-42f7-af6f-71aeeb75560e","Type":"ContainerStarted","Data":"ee510303e64372f08a02ae5a20a53e3bab98135ee01fe168a220c71c4a9c91b8"}
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.434566 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" event={"ID":"a550b2ea-3ce7-4df3-bbf5-f1025afca8c1","Type":"ContainerStarted","Data":"5a55c50b4b793697906fc96de58dc7541e4deae7df08fd6347550897da648c81"}
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.439669 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" event={"ID":"afa5e1ce-a457-4771-ab06-2654a7801704","Type":"ContainerStarted","Data":"0a0d420653cca8d8cbd7e0ac3b8114b6062c04c58afe51e78fb7e6d70799880b"}
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.445650 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.445672 4699 patch_prober.go:28] interesting pod/console-operator-58897d9998-hzqgp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.445712 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.445787 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" podUID="0c7d5fe0-885a-44e4-bacf-19bceeea178f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.526943 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.529486 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.029464491 +0000 UTC m=+192.840290955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.628399 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.628655 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.128620359 +0000 UTC m=+192.939446793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.629293 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.630941 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.130919817 +0000 UTC m=+192.941746321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.725852 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 11:14:06 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld
Feb 26 11:14:06 crc kubenswrapper[4699]: [+]process-running ok
Feb 26 11:14:06 crc kubenswrapper[4699]: healthz check failed
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.725916 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.731801 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.732180 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.232150696 +0000 UTC m=+193.042977130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.835635 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.836022 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.336008752 +0000 UTC m=+193.146835186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.852655 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" podStartSLOduration=140.852632233 podStartE2EDuration="2m20.852632233s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:06.84948197 +0000 UTC m=+192.660308414" watchObservedRunningTime="2026-02-26 11:14:06.852632233 +0000 UTC m=+192.663458667"
Feb 26 11:14:06 crc kubenswrapper[4699]: I0226 11:14:06.937290 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:06 crc kubenswrapper[4699]: E0226 11:14:06.937688 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.437671754 +0000 UTC m=+193.248498188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.038480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.038891 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.538867861 +0000 UTC m=+193.349694345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.141722 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.142497 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.64247151 +0000 UTC m=+193.453297954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.181733 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.182783 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.196344 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.243962 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.244390 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.744372899 +0000 UTC m=+193.555199323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.319551 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.345633 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.346054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6wd\" (UniqueName: \"kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.346167 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.346221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.346406 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.846386421 +0000 UTC m=+193.657212855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.351432 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-czwkc"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.352840 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.360545 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449424 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrqs\" (UniqueName: \"kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449537 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6wd\" (UniqueName: \"kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449629 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449658 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449683 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.449715 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.450100 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:07.950084772 +0000 UTC m=+193.760911206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.450979 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.451347 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.516008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6wd\" (UniqueName: \"kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd\") pod \"certified-operators-mzgjj\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.520656 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czwkc"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.524457 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vzj5b container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.524540 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" podUID="03fd3407-9529-4638-89d6-cfc6b703e510" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.539329 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phhbz"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.540541 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.551151 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.551519 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.551592 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrqs\" (UniqueName: \"kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.551750 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.552175 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.052155976 +0000 UTC m=+193.862982410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.552697 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.553376 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.574375 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.596277 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrqs\" (UniqueName: \"kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs\") pod \"community-operators-czwkc\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.616522 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phhbz"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.660967 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.661021 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-699tw\" (UniqueName: \"kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.661062 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.661094 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.663765 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.163743501 +0000 UTC m=+193.974570035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.680463 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.731830 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 11:14:07 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld
Feb 26 11:14:07 crc kubenswrapper[4699]: [+]process-running ok
Feb 26 11:14:07 crc kubenswrapper[4699]: healthz check failed
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.731895 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.770611 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhgnz"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.770814 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.771035 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.771075 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-699tw\" (UniqueName: \"kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.771103 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.771671 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.771754 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.271739429 +0000 UTC m=+194.082565863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.771888 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhgnz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.772019 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.814335 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhgnz"]
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.860866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-699tw\" (UniqueName: \"kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw\") pod \"certified-operators-phhbz\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.872015 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.872095 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.872166 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.872194 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnhh\" (UniqueName: \"kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.872586 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.372572626 +0000 UTC m=+194.183399060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.921152 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.922005 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.925001 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.936806 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.937279 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phhbz" Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.964172 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.973232 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:07 crc kubenswrapper[4699]: E0226 11:14:07.984606 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.484572653 +0000 UTC m=+194.295399087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:07 crc kubenswrapper[4699]: I0226 11:14:07.995407 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60080: no serving certificate available for the kubelet" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.007187 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.007328 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.010231 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.51020856 +0000 UTC m=+194.321034994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.011353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.025532 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.025605 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnhh\" (UniqueName: \"kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.026461 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " 
pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.077098 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnhh\" (UniqueName: \"kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh\") pod \"community-operators-fhgnz\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.088773 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60088: no serving certificate available for the kubelet" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.115042 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.130931 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.131281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.131312 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.131507 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.631488331 +0000 UTC m=+194.442314775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.225356 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60090: no serving certificate available for the kubelet" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.243974 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.244106 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 
11:14:08.244147 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.244223 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.244525 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.744510058 +0000 UTC m=+194.555336492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.311897 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.346774 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.347191 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.847168749 +0000 UTC m=+194.657995183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.351735 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60100: no serving certificate available for the kubelet" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.395769 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vzj5b container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.395819 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" podUID="03fd3407-9529-4638-89d6-cfc6b703e510" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.396181 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vzj5b container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.396202 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" 
podUID="03fd3407-9529-4638-89d6-cfc6b703e510" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.459015 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.459350 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:08.95933886 +0000 UTC m=+194.770165294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.550979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" event={"ID":"79a9064f-5fcf-42f7-af6f-71aeeb75560e","Type":"ContainerStarted","Data":"39372f8d86322e71df6515e46017a2da0585f6f0616d2e74f54472479d1581af"} Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.559946 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.561240 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.561669 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.061653391 +0000 UTC m=+194.872479815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.663019 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.663649 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.163636532 +0000 UTC m=+194.974462966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.751350 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:08 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:08 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:08 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.751407 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.763668 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.770165 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.770278 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.2702584 +0000 UTC m=+195.081084834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.770570 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.770894 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.270883599 +0000 UTC m=+195.081710033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.781823 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60104: no serving certificate available for the kubelet" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.882789 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnjhv\" (UniqueName: \"kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv\") pod \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.882871 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume\") pod \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.882904 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume\") pod \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\" (UID: \"5f8a28b8-c47b-4288-877f-8e90a3b581b5\") " Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 
11:14:08.883063 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.883465 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.383448672 +0000 UTC m=+195.194275106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.884276 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f8a28b8-c47b-4288-877f-8e90a3b581b5" (UID: "5f8a28b8-c47b-4288-877f-8e90a3b581b5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.886829 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"] Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.894690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f8a28b8-c47b-4288-877f-8e90a3b581b5" (UID: "5f8a28b8-c47b-4288-877f-8e90a3b581b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.895495 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv" (OuterVolumeSpecName: "kube-api-access-bnjhv") pod "5f8a28b8-c47b-4288-877f-8e90a3b581b5" (UID: "5f8a28b8-c47b-4288-877f-8e90a3b581b5"). InnerVolumeSpecName "kube-api-access-bnjhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.914985 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-czs8l" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.952367 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w7nqx" Feb 26 11:14:08 crc kubenswrapper[4699]: E0226 11:14:08.989701 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.489680218 +0000 UTC m=+195.300506652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.995951 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.996144 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnjhv\" (UniqueName: \"kubernetes.io/projected/5f8a28b8-c47b-4288-877f-8e90a3b581b5-kube-api-access-bnjhv\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.996159 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f8a28b8-c47b-4288-877f-8e90a3b581b5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:08 crc kubenswrapper[4699]: I0226 11:14:08.996170 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f8a28b8-c47b-4288-877f-8e90a3b581b5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:09 crc kubenswrapper[4699]: W0226 11:14:09.037272 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a83978_4f86_404b_967a_0e7493ff6721.slice/crio-1c165f4cac2c47ef0e2f5ea976276ea6634d20dbf88d2b070f23064d87eecce4 WatchSource:0}: Error finding container 1c165f4cac2c47ef0e2f5ea976276ea6634d20dbf88d2b070f23064d87eecce4: Status 404 returned error can't find the container with id 1c165f4cac2c47ef0e2f5ea976276ea6634d20dbf88d2b070f23064d87eecce4 Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.055713 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.055782 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.055716 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.056042 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.062239 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czwkc"] Feb 26 11:14:09 crc 
kubenswrapper[4699]: I0226 11:14:09.100180 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.100652 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.600633534 +0000 UTC m=+195.411459968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.122985 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60118: no serving certificate available for the kubelet" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.132884 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.165129 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvgnb" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.191954 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.199522 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.207184 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.209327 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.709310292 +0000 UTC m=+195.520136726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.232632 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.240373 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8a28b8-c47b-4288-877f-8e90a3b581b5" containerName="collect-profiles" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.240433 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8a28b8-c47b-4288-877f-8e90a3b581b5" containerName="collect-profiles" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.240731 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8a28b8-c47b-4288-877f-8e90a3b581b5" containerName="collect-profiles" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.241320 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.285749 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.285839 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.287889 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.287919 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.302467 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.326296 4699 patch_prober.go:28] interesting pod/console-f9d7485db-hnsh7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.326364 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hnsh7" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.328520 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.329069 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.329104 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.329239 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.829220773 +0000 UTC m=+195.640047207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.372512 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.379881 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.407870 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.419904 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.420472 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.420500 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.430389 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr725\" (UniqueName: \"kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc 
kubenswrapper[4699]: I0226 11:14:09.430498 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.430557 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.430605 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.430697 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.430738 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.432956 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.433678 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:09.933660836 +0000 UTC m=+195.744487260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.436025 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.436544 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.470017 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60134: no serving certificate available for the kubelet" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.485014 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.489922 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.524005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.535334 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.535517 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.535709 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr725\" (UniqueName: \"kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.535803 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.536702 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.036675128 +0000 UTC m=+195.847501572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.537190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.540549 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.607674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rr725\" (UniqueName: \"kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725\") pod \"redhat-marketplace-s8kpz\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.607824 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerStarted","Data":"31376761fbf12a5b81018d6bde894ab4db92607e39e297d6342dce3d31049346"} Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.632015 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" event={"ID":"5f8a28b8-c47b-4288-877f-8e90a3b581b5","Type":"ContainerDied","Data":"9cc8202a0a693b54f9a7afa4f72146520cc57d28a34110bea4d4992553af18b6"} Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.632064 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc8202a0a693b54f9a7afa4f72146520cc57d28a34110bea4d4992553af18b6" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.632205 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.640322 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.640958 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.140943566 +0000 UTC m=+195.951770000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.654357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerStarted","Data":"1c165f4cac2c47ef0e2f5ea976276ea6634d20dbf88d2b070f23064d87eecce4"} Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.674328 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tm98c" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.674687 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.719442 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.736885 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:09 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:09 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:09 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.737288 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.747518 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.749163 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.249140761 +0000 UTC m=+196.059967195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.781075 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.782618 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.789270 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhgnz"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.805796 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.806019 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" containerID="cri-o://4e6b4035a7e79b8d64117aea9e3cf6e2de88e935585f4e47f14b7523b0476a41" gracePeriod=30 Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.818751 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"] Feb 26 11:14:09 crc kubenswrapper[4699]: W0226 11:14:09.820105 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1389c8c4_9546_4193_8067_50db90448d4f.slice/crio-f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc WatchSource:0}: Error finding container f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc: Status 404 returned error can't find the container with id f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.840485 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hzqgp" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.855596 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhdp\" (UniqueName: \"kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.855706 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.855750 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.855778 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.857180 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.35716463 +0000 UTC m=+196.167991064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.877469 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.905360 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phhbz"] Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.960036 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.960429 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhdp\" (UniqueName: \"kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.960569 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.960612 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.961309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: E0226 11:14:09.964240 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.46421198 +0000 UTC m=+196.275038414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.965340 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:09 crc kubenswrapper[4699]: I0226 11:14:09.986611 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60138: no serving certificate available for the kubelet" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.025995 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.056452 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhdp\" (UniqueName: 
\"kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp\") pod \"redhat-marketplace-hrk4n\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.057001 4699 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.063965 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.064170 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" containerID="cri-o://e82195f6e02bee889f35431d957fa456c0b1db807d039ddcd99b290da7bc9288" gracePeriod=30 Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.065430 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.069294 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.569270612 +0000 UTC m=+196.380097046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: W0226 11:14:10.137498 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea10063_7888_400e_af1c_216cbde5a13e.slice/crio-c8dc58ce346d0f6b6aad0363b33f0cf4112745523923e0e7d1cf3d865b90372a WatchSource:0}: Error finding container c8dc58ce346d0f6b6aad0363b33f0cf4112745523923e0e7d1cf3d865b90372a: Status 404 returned error can't find the container with id c8dc58ce346d0f6b6aad0363b33f0cf4112745523923e0e7d1cf3d865b90372a Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.169102 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.169912 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.669889813 +0000 UTC m=+196.480716247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.221777 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.272287 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.272658 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.772643817 +0000 UTC m=+196.583470251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.332939 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.336011 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.343926 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.374033 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.374505 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.374662 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tqhd\" 
(UniqueName: \"kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.374703 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.374770 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.874749641 +0000 UTC m=+196.685576075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.438307 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.458051 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.481644 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tqhd\" (UniqueName: \"kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.481757 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.481803 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc 
kubenswrapper[4699]: I0226 11:14:10.481997 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.482514 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:10.982501332 +0000 UTC m=+196.793327766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.482679 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.483005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " 
pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: W0226 11:14:10.518985 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7b6f5fd6_ce1d_48d1_bb78_237e07a93ff7.slice/crio-c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d WatchSource:0}: Error finding container c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d: Status 404 returned error can't find the container with id c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.537527 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tqhd\" (UniqueName: \"kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd\") pod \"redhat-operators-sc9c6\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.585867 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.586881 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:11.086861213 +0000 UTC m=+196.897687647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.682610 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60142: no serving certificate available for the kubelet" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.689705 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.690085 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:11.190069601 +0000 UTC m=+197.000896035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.713623 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7","Type":"ContainerStarted","Data":"c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d"} Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.715516 4699 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T11:14:10.057024491Z","Handler":null,"Name":""} Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.721387 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.743811 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jhgks"] Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.744061 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.744133 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.744342 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerName="controller-manager" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.745261 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.755236 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:10 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:10 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:10 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.755331 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.766606 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.790989 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca\") pod \"796e9631-3388-48b1-8675-3fbc4b6e435d\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791324 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791443 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config\") pod \"796e9631-3388-48b1-8675-3fbc4b6e435d\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791507 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles\") pod \"796e9631-3388-48b1-8675-3fbc4b6e435d\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791617 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert\") pod \"796e9631-3388-48b1-8675-3fbc4b6e435d\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791652 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-vjjgq\" (UniqueName: \"kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq\") pod \"796e9631-3388-48b1-8675-3fbc4b6e435d\" (UID: \"796e9631-3388-48b1-8675-3fbc4b6e435d\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.791966 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.792012 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.792043 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jnw\" (UniqueName: \"kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.793453 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca" (OuterVolumeSpecName: "client-ca") pod "796e9631-3388-48b1-8675-3fbc4b6e435d" (UID: "796e9631-3388-48b1-8675-3fbc4b6e435d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.793545 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 11:14:11.293526565 +0000 UTC m=+197.104352999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.794047 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config" (OuterVolumeSpecName: "config") pod "796e9631-3388-48b1-8675-3fbc4b6e435d" (UID: "796e9631-3388-48b1-8675-3fbc4b6e435d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.797451 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "796e9631-3388-48b1-8675-3fbc4b6e435d" (UID: "796e9631-3388-48b1-8675-3fbc4b6e435d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.799301 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhgks"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.893742 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.894793 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.894924 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.894970 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jnw\" (UniqueName: \"kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.895234 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.895263 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.895283 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/796e9631-3388-48b1-8675-3fbc4b6e435d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:10 crc kubenswrapper[4699]: E0226 11:14:10.896089 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 11:14:11.396075553 +0000 UTC m=+197.206901987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8656" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.896792 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.897156 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.910091 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "796e9631-3388-48b1-8675-3fbc4b6e435d" (UID: "796e9631-3388-48b1-8675-3fbc4b6e435d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.919865 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq" (OuterVolumeSpecName: "kube-api-access-vjjgq") pod "796e9631-3388-48b1-8675-3fbc4b6e435d" (UID: "796e9631-3388-48b1-8675-3fbc4b6e435d"). InnerVolumeSpecName "kube-api-access-vjjgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.921532 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4168d2c5-00be-4270-9a2b-c2b8847e4593","Type":"ContainerStarted","Data":"17985416ddd6869d379ec85a7e60b0e0d55af715771863737c8600be5d531fb7"} Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.931540 4699 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.932546 4699 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.944369 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jnw\" (UniqueName: \"kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw\") pod \"redhat-operators-jhgks\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.958752 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.983774 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"] Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.997443 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.997794 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796e9631-3388-48b1-8675-3fbc4b6e435d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:10 crc kubenswrapper[4699]: I0226 11:14:10.997808 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjjgq\" (UniqueName: \"kubernetes.io/projected/796e9631-3388-48b1-8675-3fbc4b6e435d-kube-api-access-vjjgq\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.013302 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.038802 4699 generic.go:334] "Generic (PLEG): container finished" podID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerID="e82195f6e02bee889f35431d957fa456c0b1db807d039ddcd99b290da7bc9288" exitCode=0 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.038920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" event={"ID":"744aa737-e6c7-4d6b-ba7d-a9479043ad29","Type":"ContainerDied","Data":"e82195f6e02bee889f35431d957fa456c0b1db807d039ddcd99b290da7bc9288"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.049367 4699 generic.go:334] "Generic (PLEG): container finished" podID="1389c8c4-9546-4193-8067-50db90448d4f" containerID="52ffe1a540a589fb575f8cfc11cab09c8b7aa57c3ace31541c3b66e087bf8460" exitCode=0 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.049531 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerDied","Data":"52ffe1a540a589fb575f8cfc11cab09c8b7aa57c3ace31541c3b66e087bf8460"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.049566 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerStarted","Data":"f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.056700 4699 generic.go:334] "Generic (PLEG): container finished" podID="71a83978-4f86-404b-967a-0e7493ff6721" containerID="f1b31944470f82af52e860af7004767cf2db0ef2acdf2a9986adc95701213e55" exitCode=0 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.056934 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" 
event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerDied","Data":"f1b31944470f82af52e860af7004767cf2db0ef2acdf2a9986adc95701213e55"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.087501 4699 generic.go:334] "Generic (PLEG): container finished" podID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerID="39ff3a6e4269604cce0aea66db001b967d934c0076038e7958d8b015de9375a1" exitCode=0 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.087615 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerDied","Data":"39ff3a6e4269604cce0aea66db001b967d934c0076038e7958d8b015de9375a1"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.098676 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" event={"ID":"79a9064f-5fcf-42f7-af6f-71aeeb75560e","Type":"ContainerStarted","Data":"d1999d25b37d3ebe60c93058b5602a84f3b965ce72276f7b288ccf6a4dd3ff40"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.099600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.105249 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.111295 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerStarted","Data":"c8dc58ce346d0f6b6aad0363b33f0cf4112745523923e0e7d1cf3d865b90372a"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.124014 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"] Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.135051 4699 generic.go:334] "Generic (PLEG): container finished" podID="796e9631-3388-48b1-8675-3fbc4b6e435d" containerID="4e6b4035a7e79b8d64117aea9e3cf6e2de88e935585f4e47f14b7523b0476a41" exitCode=0 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.135196 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" event={"ID":"796e9631-3388-48b1-8675-3fbc4b6e435d","Type":"ContainerDied","Data":"4e6b4035a7e79b8d64117aea9e3cf6e2de88e935585f4e47f14b7523b0476a41"} Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.135236 4699 scope.go:117] "RemoveContainer" containerID="4e6b4035a7e79b8d64117aea9e3cf6e2de88e935585f4e47f14b7523b0476a41" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.135274 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsl8w" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.147973 4699 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.148019 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.202903 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66m62\" (UniqueName: \"kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62\") pod \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.202959 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca\") pod \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.202982 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert\") pod \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\" (UID: \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.203031 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config\") pod \"744aa737-e6c7-4d6b-ba7d-a9479043ad29\" (UID: 
\"744aa737-e6c7-4d6b-ba7d-a9479043ad29\") " Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.205759 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca" (OuterVolumeSpecName: "client-ca") pod "744aa737-e6c7-4d6b-ba7d-a9479043ad29" (UID: "744aa737-e6c7-4d6b-ba7d-a9479043ad29"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.206371 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config" (OuterVolumeSpecName: "config") pod "744aa737-e6c7-4d6b-ba7d-a9479043ad29" (UID: "744aa737-e6c7-4d6b-ba7d-a9479043ad29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.223217 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62" (OuterVolumeSpecName: "kube-api-access-66m62") pod "744aa737-e6c7-4d6b-ba7d-a9479043ad29" (UID: "744aa737-e6c7-4d6b-ba7d-a9479043ad29"). InnerVolumeSpecName "kube-api-access-66m62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.230243 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "744aa737-e6c7-4d6b-ba7d-a9479043ad29" (UID: "744aa737-e6c7-4d6b-ba7d-a9479043ad29"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.238848 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8656\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.264849 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"] Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.265237 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsl8w"] Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.311829 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66m62\" (UniqueName: \"kubernetes.io/projected/744aa737-e6c7-4d6b-ba7d-a9479043ad29-kube-api-access-66m62\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.311867 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.311884 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/744aa737-e6c7-4d6b-ba7d-a9479043ad29-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.311896 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/744aa737-e6c7-4d6b-ba7d-a9479043ad29-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.357698 4699 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.360331 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.411211 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.585056 4699 patch_prober.go:28] interesting pod/apiserver-76f77b778f-f8s5j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]log ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]etcd ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/generic-apiserver-start-informers ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/max-in-flight-filter ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 26 11:14:11 crc kubenswrapper[4699]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 26 11:14:11 crc kubenswrapper[4699]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/project.openshift.io-projectcache ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 26 11:14:11 crc kubenswrapper[4699]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 26 11:14:11 crc 
kubenswrapper[4699]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 26 11:14:11 crc kubenswrapper[4699]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 26 11:14:11 crc kubenswrapper[4699]: livez check failed Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.585490 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" podUID="a550b2ea-3ce7-4df3-bbf5-f1025afca8c1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.657725 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"] Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.720688 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhgks"] Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.730632 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:11 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:11 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:11 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.730679 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.867839 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"] Feb 26 11:14:11 crc kubenswrapper[4699]: W0226 11:14:11.901464 4699 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7232eb23_31ae_4e72_ae27_c256dc4cac9a.slice/crio-39ab42bfea1ba6c0800a2508ff52b7eb12199142899ef006de5ffbee4f2135a3 WatchSource:0}: Error finding container 39ab42bfea1ba6c0800a2508ff52b7eb12199142899ef006de5ffbee4f2135a3: Status 404 returned error can't find the container with id 39ab42bfea1ba6c0800a2508ff52b7eb12199142899ef006de5ffbee4f2135a3 Feb 26 11:14:11 crc kubenswrapper[4699]: I0226 11:14:11.970501 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.009356 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60148: no serving certificate available for the kubelet" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.163703 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7","Type":"ContainerStarted","Data":"3f1d1b042714ef5e17278b4b9d7bda264b1ad3f30ee29d94ea5f89d3666dabf8"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.168845 4699 generic.go:334] "Generic (PLEG): container finished" podID="4168d2c5-00be-4270-9a2b-c2b8847e4593" containerID="20bd2ec5dc40708614f63aff6ec3d623d4d0578a8054c95d7cd179f080c036fe" exitCode=0 Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.168920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4168d2c5-00be-4270-9a2b-c2b8847e4593","Type":"ContainerDied","Data":"20bd2ec5dc40708614f63aff6ec3d623d4d0578a8054c95d7cd179f080c036fe"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.174802 4699 generic.go:334] "Generic (PLEG): container finished" podID="9ea10063-7888-400e-af1c-216cbde5a13e" containerID="e2ca3e75def51c6eedb622aaa6507c8da48849ebf241567dc8e903d48fc3a6e5" exitCode=0 Feb 26 11:14:12 crc 
kubenswrapper[4699]: I0226 11:14:12.174892 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerDied","Data":"e2ca3e75def51c6eedb622aaa6507c8da48849ebf241567dc8e903d48fc3a6e5"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.178552 4699 generic.go:334] "Generic (PLEG): container finished" podID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerID="0d415d903af1673dff3ecf368cade4c0a0a93c2b3158c0519393d68509c7e6d3" exitCode=0 Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.178635 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerDied","Data":"0d415d903af1673dff3ecf368cade4c0a0a93c2b3158c0519393d68509c7e6d3"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.178664 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerStarted","Data":"8416abc544344d1375d554f38d43ac67e9642de8063e20464268f9eaf0d51147"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.184216 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.184199884 podStartE2EDuration="3.184199884s" podCreationTimestamp="2026-02-26 11:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:12.182805573 +0000 UTC m=+197.993632027" watchObservedRunningTime="2026-02-26 11:14:12.184199884 +0000 UTC m=+197.995026318" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.185387 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e7ddf51-5522-4085-8567-76c9a254ed15" 
containerID="3255d554cf00b3f149c14b7b5562baa6c773b2f01ac34c99e514e81d89810bb1" exitCode=0 Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.185538 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerDied","Data":"3255d554cf00b3f149c14b7b5562baa6c773b2f01ac34c99e514e81d89810bb1"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.185569 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerStarted","Data":"64ab7f5c1142b79d1cad6017fda721d048cccdd042121faa577213948620ffa2"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.198821 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" event={"ID":"79a9064f-5fcf-42f7-af6f-71aeeb75560e","Type":"ContainerStarted","Data":"d95936cc463e6a5cd671e1ed3f7b1e7ee6c426a777f09428f3b6caa620a5e5de"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.202916 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"] Feb 26 11:14:12 crc kubenswrapper[4699]: E0226 11:14:12.203312 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.203340 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.203690 4699 generic.go:334] "Generic (PLEG): container finished" podID="8c96a703-e568-4916-8035-a951ae91dc2b" containerID="0c88d150d726034804b09cdfd6ed7b9a516e4ecd807d5799c0ea12f3955c7b69" exitCode=0 Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.203840 4699 
memory_manager.go:354] "RemoveStaleState removing state" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" containerName="route-controller-manager" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.204466 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerDied","Data":"0c88d150d726034804b09cdfd6ed7b9a516e4ecd807d5799c0ea12f3955c7b69"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.204509 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerStarted","Data":"18a720cd12fbf1604976388b722cf7ea85f1660cb3d90ac7f016d51d465b43d1"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.204612 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.207179 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" event={"ID":"744aa737-e6c7-4d6b-ba7d-a9479043ad29","Type":"ContainerDied","Data":"3f258b9ae41f11af5114ab5232e03c4aa9dff40c08fe1e6fde31d40c3ec891ec"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.207225 4699 scope.go:117] "RemoveContainer" containerID="e82195f6e02bee889f35431d957fa456c0b1db807d039ddcd99b290da7bc9288" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.207326 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.208337 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.209290 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.219559 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.219721 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.220213 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.220326 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.220384 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.228788 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.237621 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" event={"ID":"7232eb23-31ae-4e72-ae27-c256dc4cac9a","Type":"ContainerStarted","Data":"39ab42bfea1ba6c0800a2508ff52b7eb12199142899ef006de5ffbee4f2135a3"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 
11:14:12.242913 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.248488 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"] Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.286948 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerID="e514effd43a8aac49eb2edbdb6959f6095c102c0f8bc4412986233930c5d5ff6" exitCode=0 Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.292691 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796e9631-3388-48b1-8675-3fbc4b6e435d" path="/var/lib/kubelet/pods/796e9631-3388-48b1-8675-3fbc4b6e435d/volumes" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.293901 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.294629 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerDied","Data":"e514effd43a8aac49eb2edbdb6959f6095c102c0f8bc4412986233930c5d5ff6"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.294861 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.294970 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerStarted","Data":"1df59f3f6cf47eeaee6c7803f5d095457eb18adeaca6dc9c81e5b0dfb758e003"} Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.339969 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qzphl" podStartSLOduration=17.339950752 podStartE2EDuration="17.339950752s" podCreationTimestamp="2026-02-26 11:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:12.33818598 +0000 UTC m=+198.149012444" watchObservedRunningTime="2026-02-26 11:14:12.339950752 +0000 UTC m=+198.150777186" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352266 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352574 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md2wz\" (UniqueName: \"kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352670 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352761 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352842 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.352949 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87n7\" (UniqueName: \"kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.353034 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.353109 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert\") pod \"controller-manager-7c78ff548b-nppmt\" 
(UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.353290 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.415363 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"] Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.422825 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fq7g8"] Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.455592 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.455946 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.455999 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456105 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v87n7\" (UniqueName: \"kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456155 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456212 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456289 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456399 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.456439 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md2wz\" (UniqueName: \"kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.460645 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.461501 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.462238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " 
pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.465344 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.468572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.471621 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.471782 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.480567 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md2wz\" (UniqueName: 
\"kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz\") pod \"route-controller-manager-59b4784554-77qxf\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") " pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.485198 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87n7\" (UniqueName: \"kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7\") pod \"controller-manager-7c78ff548b-nppmt\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") " pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.581149 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.624462 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.734512 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:12 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:12 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:12 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:12 crc kubenswrapper[4699]: I0226 11:14:12.734590 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.147767 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"] Feb 26 11:14:13 crc kubenswrapper[4699]: W0226 11:14:13.175047 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f265819_8c24_4d84_9afe_423152764dfb.slice/crio-ad9a383bdf2560f2e8e699488a34354989214933a94da177587d7b8f57d53fa3 WatchSource:0}: Error finding container ad9a383bdf2560f2e8e699488a34354989214933a94da177587d7b8f57d53fa3: Status 404 returned error can't find the container with id ad9a383bdf2560f2e8e699488a34354989214933a94da177587d7b8f57d53fa3 Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.331258 4699 generic.go:334] "Generic (PLEG): container finished" podID="7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" containerID="3f1d1b042714ef5e17278b4b9d7bda264b1ad3f30ee29d94ea5f89d3666dabf8" exitCode=0 Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.331634 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7","Type":"ContainerDied","Data":"3f1d1b042714ef5e17278b4b9d7bda264b1ad3f30ee29d94ea5f89d3666dabf8"} Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.337593 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" event={"ID":"7232eb23-31ae-4e72-ae27-c256dc4cac9a","Type":"ContainerStarted","Data":"4779da011a858c6a8df3a7fdfbfb2e01a004c252953c916a440f89808caa4efd"} Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.338413 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.343765 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" event={"ID":"2f265819-8c24-4d84-9afe-423152764dfb","Type":"ContainerStarted","Data":"ad9a383bdf2560f2e8e699488a34354989214933a94da177587d7b8f57d53fa3"} Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.383099 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" podStartSLOduration=147.380797143 podStartE2EDuration="2m27.380797143s" podCreationTimestamp="2026-02-26 11:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:13.379067892 +0000 UTC m=+199.189894416" watchObservedRunningTime="2026-02-26 11:14:13.380797143 +0000 UTC m=+199.191623577" Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.459220 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:14:13 crc kubenswrapper[4699]: W0226 11:14:13.487020 4699 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1873d943_4785_4bc5_a9c4_5a027a932464.slice/crio-8c28a63f20c67fc69e31c70403424ed7ca9920d8e519c0772be0d307ed557c9d WatchSource:0}: Error finding container 8c28a63f20c67fc69e31c70403424ed7ca9920d8e519c0772be0d307ed557c9d: Status 404 returned error can't find the container with id 8c28a63f20c67fc69e31c70403424ed7ca9920d8e519c0772be0d307ed557c9d Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.722773 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:13 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:13 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:13 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.722839 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.886029 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.992127 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir\") pod \"4168d2c5-00be-4270-9a2b-c2b8847e4593\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.992323 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access\") pod \"4168d2c5-00be-4270-9a2b-c2b8847e4593\" (UID: \"4168d2c5-00be-4270-9a2b-c2b8847e4593\") " Feb 26 11:14:13 crc kubenswrapper[4699]: I0226 11:14:13.998079 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4168d2c5-00be-4270-9a2b-c2b8847e4593" (UID: "4168d2c5-00be-4270-9a2b-c2b8847e4593"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:13.999956 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tnwpn" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.000817 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4168d2c5-00be-4270-9a2b-c2b8847e4593" (UID: "4168d2c5-00be-4270-9a2b-c2b8847e4593"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.096008 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4168d2c5-00be-4270-9a2b-c2b8847e4593-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.096041 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4168d2c5-00be-4270-9a2b-c2b8847e4593-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.294413 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744aa737-e6c7-4d6b-ba7d-a9479043ad29" path="/var/lib/kubelet/pods/744aa737-e6c7-4d6b-ba7d-a9479043ad29/volumes" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.394479 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4168d2c5-00be-4270-9a2b-c2b8847e4593","Type":"ContainerDied","Data":"17985416ddd6869d379ec85a7e60b0e0d55af715771863737c8600be5d531fb7"} Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.394524 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17985416ddd6869d379ec85a7e60b0e0d55af715771863737c8600be5d531fb7" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.394591 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.426495 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" event={"ID":"2f265819-8c24-4d84-9afe-423152764dfb","Type":"ContainerStarted","Data":"438953ae4c2dd5482a447caa172401f76f9355d9f070a2edd3604d4596edc619"} Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.427337 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.434610 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.437402 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.441503 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" event={"ID":"1873d943-4785-4bc5-a9c4-5a027a932464","Type":"ContainerStarted","Data":"a08711cd4ea22926a99c6ab95e9c9d94bed1d66b8cc11c69332013859eda951b"} Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.441554 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.441569 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" event={"ID":"1873d943-4785-4bc5-a9c4-5a027a932464","Type":"ContainerStarted","Data":"8c28a63f20c67fc69e31c70403424ed7ca9920d8e519c0772be0d307ed557c9d"} Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 
11:14:14.445559 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-f8s5j" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.461264 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" podStartSLOduration=4.461245832 podStartE2EDuration="4.461245832s" podCreationTimestamp="2026-02-26 11:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:14.460050217 +0000 UTC m=+200.270876671" watchObservedRunningTime="2026-02-26 11:14:14.461245832 +0000 UTC m=+200.272072276" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.495410 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.501641 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" podStartSLOduration=4.501617254 podStartE2EDuration="4.501617254s" podCreationTimestamp="2026-02-26 11:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:14:14.497443501 +0000 UTC m=+200.308269945" watchObservedRunningTime="2026-02-26 11:14:14.501617254 +0000 UTC m=+200.312443688" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.704981 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60164: no serving certificate available for the kubelet" Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.726386 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:14 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:14 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:14 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:14 crc kubenswrapper[4699]: I0226 11:14:14.726700 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.395441 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.531960 4699 ???:1] "http: TLS handshake error from 192.168.126.11:60166: no serving certificate available for the kubelet" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.550702 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.550839 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7","Type":"ContainerDied","Data":"c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d"} Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.550870 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ec8f209323100d0e1a79f63a539f1e19839405896844b0751e4c3bb600c54d" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.552773 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir\") pod \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.552843 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access\") pod \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\" (UID: \"7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7\") " Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.554025 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" (UID: "7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.576835 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" (UID: "7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.654942 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.655369 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.734462 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:15 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:15 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:15 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:15 crc kubenswrapper[4699]: I0226 11:14:15.734544 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:16 crc kubenswrapper[4699]: I0226 11:14:16.721952 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:16 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:16 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:16 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:16 crc kubenswrapper[4699]: I0226 11:14:16.722001 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:17 crc kubenswrapper[4699]: I0226 11:14:17.726637 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:17 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:17 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:17 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:17 crc kubenswrapper[4699]: I0226 11:14:17.727024 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:18 crc kubenswrapper[4699]: I0226 11:14:18.722257 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:18 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:18 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:18 crc 
kubenswrapper[4699]: healthz check failed Feb 26 11:14:18 crc kubenswrapper[4699]: I0226 11:14:18.722333 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.055863 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.055916 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.055980 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.056036 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.260331 4699 patch_prober.go:28] interesting pod/console-f9d7485db-hnsh7 container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.260413 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hnsh7" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.721171 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:19 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:19 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:19 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.721399 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:19 crc kubenswrapper[4699]: I0226 11:14:19.861552 4699 ???:1] "http: TLS handshake error from 192.168.126.11:47002: no serving certificate available for the kubelet" Feb 26 11:14:20 crc kubenswrapper[4699]: I0226 11:14:20.720366 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:20 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:20 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:20 crc kubenswrapper[4699]: healthz check failed 
Feb 26 11:14:20 crc kubenswrapper[4699]: I0226 11:14:20.720722 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:21 crc kubenswrapper[4699]: I0226 11:14:21.722152 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:21 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:21 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:21 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:21 crc kubenswrapper[4699]: I0226 11:14:21.722213 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:22 crc kubenswrapper[4699]: I0226 11:14:22.727151 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:22 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Feb 26 11:14:22 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:22 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:22 crc kubenswrapper[4699]: I0226 11:14:22.727236 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:23 crc 
kubenswrapper[4699]: I0226 11:14:23.725576 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 11:14:23 crc kubenswrapper[4699]: [+]has-synced ok Feb 26 11:14:23 crc kubenswrapper[4699]: [+]process-running ok Feb 26 11:14:23 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:23 crc kubenswrapper[4699]: I0226 11:14:23.725713 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:24 crc kubenswrapper[4699]: I0226 11:14:24.721633 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:14:24 crc kubenswrapper[4699]: I0226 11:14:24.726051 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xm88w" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:28.870175 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:28.870276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:28.870341 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:28.870388 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.055581 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.055635 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.056010 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 
26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.056027 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.254355 4699 patch_prober.go:28] interesting pod/console-f9d7485db-hnsh7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 26 11:14:29 crc kubenswrapper[4699]: I0226 11:14:29.254434 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hnsh7" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.340628 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.341002 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:16:34.340975723 +0000 UTC m=+340.151802167 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.341059 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.341126 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.341181 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.341247 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 11:16:34.34122796 +0000 UTC m=+340.152054394 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.383438 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Liveness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]controller ok Feb 26 11:14:32 crc kubenswrapper[4699]: [-]backend-http failed: reason withheld Feb 26 11:14:32 crc kubenswrapper[4699]: healthz check failed Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.383523 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.383971 4699 patch_prober.go:28] interesting pod/router-default-5444994796-xm88w container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.384004 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-xm88w" podUID="4a97e310-1811-48a9-a31a-eb9a0321d280" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.391358 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.391894 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.392014 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.394370 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.394460 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 11:16:34.394425371 +0000 UTC m=+340.205251795 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.394566 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.394632 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 11:16:34.394610917 +0000 UTC m=+340.205437351 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Feb 26 11:14:32 crc kubenswrapper[4699]: I0226 11:14:32.397693 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 11:14:32 crc kubenswrapper[4699]: E0226 11:14:32.424105 4699 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.529s" Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.554369 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vzj5b container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": context 
deadline exceeded" start-of-body= Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.554460 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vzj5b" podUID="03fd3407-9529-4638-89d6-cfc6b703e510" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": context deadline exceeded" Feb 26 11:14:33 crc kubenswrapper[4699]: E0226 11:14:33.725267 4699 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.301s" Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.725413 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.743294 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.746483 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"a48cf703fa085cd2031065303843172d7d70091d4cf97de0a11e40328102d59a"} pod="openshift-console/downloads-7954f5f757-tcnxt" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.752314 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" containerID="cri-o://a48cf703fa085cd2031065303843172d7d70091d4cf97de0a11e40328102d59a" gracePeriod=2 Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.769888 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:33 crc kubenswrapper[4699]: I0226 11:14:33.769992 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.505845 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.506152 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager" containerID="cri-o://a08711cd4ea22926a99c6ab95e9c9d94bed1d66b8cc11c69332013859eda951b" gracePeriod=30 Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.534392 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"] Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.534678 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager" containerID="cri-o://438953ae4c2dd5482a447caa172401f76f9355d9f070a2edd3604d4596edc619" gracePeriod=30 Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.754169 4699 generic.go:334] "Generic (PLEG): container finished" podID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerID="a48cf703fa085cd2031065303843172d7d70091d4cf97de0a11e40328102d59a" exitCode=0 Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.754292 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tcnxt" event={"ID":"72b1bc55-f48b-4d90-ab02-3a80438096b6","Type":"ContainerDied","Data":"a48cf703fa085cd2031065303843172d7d70091d4cf97de0a11e40328102d59a"} Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.759161 4699 generic.go:334] "Generic (PLEG): container finished" podID="1873d943-4785-4bc5-a9c4-5a027a932464" containerID="a08711cd4ea22926a99c6ab95e9c9d94bed1d66b8cc11c69332013859eda951b" exitCode=0 Feb 26 11:14:34 crc kubenswrapper[4699]: I0226 11:14:34.759216 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" event={"ID":"1873d943-4785-4bc5-a9c4-5a027a932464","Type":"ContainerDied","Data":"a08711cd4ea22926a99c6ab95e9c9d94bed1d66b8cc11c69332013859eda951b"} Feb 26 11:14:35 crc kubenswrapper[4699]: I0226 11:14:35.898244 4699 generic.go:334] "Generic (PLEG): container finished" podID="2f265819-8c24-4d84-9afe-423152764dfb" containerID="438953ae4c2dd5482a447caa172401f76f9355d9f070a2edd3604d4596edc619" exitCode=0 Feb 26 11:14:35 crc kubenswrapper[4699]: I0226 11:14:35.898331 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" event={"ID":"2f265819-8c24-4d84-9afe-423152764dfb","Type":"ContainerDied","Data":"438953ae4c2dd5482a447caa172401f76f9355d9f070a2edd3604d4596edc619"} Feb 26 11:14:39 crc kubenswrapper[4699]: I0226 11:14:39.066012 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:39 crc kubenswrapper[4699]: I0226 11:14:39.066457 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:39 crc kubenswrapper[4699]: I0226 11:14:39.071889 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dk749" Feb 26 11:14:39 crc kubenswrapper[4699]: I0226 11:14:39.266610 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:14:39 crc kubenswrapper[4699]: I0226 11:14:39.275674 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:14:40 crc kubenswrapper[4699]: I0226 11:14:40.443219 4699 ???:1] "http: TLS handshake error from 192.168.126.11:40548: no serving certificate available for the kubelet" Feb 26 11:14:41 crc kubenswrapper[4699]: I0226 11:14:41.585232 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:14:41 crc kubenswrapper[4699]: I0226 11:14:41.585292 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:14:42 crc kubenswrapper[4699]: I0226 11:14:42.626960 4699 patch_prober.go:28] interesting pod/route-controller-manager-59b4784554-77qxf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 
10.217.0.55:8443: connect: connection refused" start-of-body= Feb 26 11:14:42 crc kubenswrapper[4699]: I0226 11:14:42.627442 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 26 11:14:42 crc kubenswrapper[4699]: I0226 11:14:42.628078 4699 patch_prober.go:28] interesting pod/controller-manager-7c78ff548b-nppmt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Feb 26 11:14:42 crc kubenswrapper[4699]: I0226 11:14:42.628160 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.576605 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 11:14:45 crc kubenswrapper[4699]: E0226 11:14:45.576899 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4168d2c5-00be-4270-9a2b-c2b8847e4593" containerName="pruner" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.576916 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4168d2c5-00be-4270-9a2b-c2b8847e4593" containerName="pruner" Feb 26 11:14:45 crc kubenswrapper[4699]: E0226 11:14:45.576930 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" containerName="pruner" Feb 26 
11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.576939 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" containerName="pruner" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.577087 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4168d2c5-00be-4270-9a2b-c2b8847e4593" containerName="pruner" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.577101 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6f5fd6-ce1d-48d1-bb78-237e07a93ff7" containerName="pruner" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.577486 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.579622 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.580102 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.590582 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.750743 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.750830 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.852605 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.852793 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.852937 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.892715 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:45 crc kubenswrapper[4699]: I0226 11:14:45.910226 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:14:49 crc kubenswrapper[4699]: I0226 11:14:49.057735 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:14:49 crc kubenswrapper[4699]: I0226 11:14:49.057839 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.782809 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.784363 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.788262 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.881188 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.881333 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.881435 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.983382 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.983459 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.983534 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.983615 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:50 crc kubenswrapper[4699]: I0226 11:14:50.983952 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:51 crc kubenswrapper[4699]: I0226 11:14:51.002716 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access\") pod \"installer-9-crc\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:51 crc kubenswrapper[4699]: I0226 11:14:51.122627 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 11:14:52 crc kubenswrapper[4699]: I0226 11:14:52.583355 4699 patch_prober.go:28] interesting pod/route-controller-manager-59b4784554-77qxf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body=
Feb 26 11:14:52 crc kubenswrapper[4699]: I0226 11:14:52.584247 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused"
Feb 26 11:14:52 crc kubenswrapper[4699]: I0226 11:14:52.625455 4699 patch_prober.go:28] interesting pod/controller-manager-7c78ff548b-nppmt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Feb 26 11:14:52 crc kubenswrapper[4699]: I0226 11:14:52.625517 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Feb 26 11:14:59 crc kubenswrapper[4699]: I0226 11:14:59.056751 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 26 11:14:59 crc kubenswrapper[4699]: I0226 11:14:59.057141 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.151166 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"]
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.153043 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.155722 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.156940 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.158637 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"]
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.284331 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.284392 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.284467 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmjp\" (UniqueName: \"kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.385719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.385777 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.385851 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmjp\" (UniqueName: \"kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.630034 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.639659 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmjp\" (UniqueName: \"kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.640174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume\") pod \"collect-profiles-29535075-hl4g4\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:00 crc kubenswrapper[4699]: I0226 11:15:00.776722 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"
Feb 26 11:15:02 crc kubenswrapper[4699]: I0226 11:15:02.642382 4699 patch_prober.go:28] interesting pod/controller-manager-7c78ff548b-nppmt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Feb 26 11:15:02 crc kubenswrapper[4699]: I0226 11:15:02.642946 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Feb 26 11:15:03 crc kubenswrapper[4699]: I0226 11:15:03.639923 4699 patch_prober.go:28] interesting pod/route-controller-manager-59b4784554-77qxf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 11:15:03 crc kubenswrapper[4699]: I0226 11:15:03.640650 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.589861 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.625103 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"]
Feb 26 11:15:08 crc kubenswrapper[4699]: E0226 11:15:08.625759 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.625817 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.625986 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f265819-8c24-4d84-9afe-423152764dfb" containerName="route-controller-manager"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.626701 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.649486 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"]
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.718591 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca\") pod \"2f265819-8c24-4d84-9afe-423152764dfb\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") "
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.718667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md2wz\" (UniqueName: \"kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz\") pod \"2f265819-8c24-4d84-9afe-423152764dfb\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") "
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.718735 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config\") pod \"2f265819-8c24-4d84-9afe-423152764dfb\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") "
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.718798 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert\") pod \"2f265819-8c24-4d84-9afe-423152764dfb\" (UID: \"2f265819-8c24-4d84-9afe-423152764dfb\") "
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.719031 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8wsx\" (UniqueName: \"kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.719128 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.719185 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.719242 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.720486 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config" (OuterVolumeSpecName: "config") pod "2f265819-8c24-4d84-9afe-423152764dfb" (UID: "2f265819-8c24-4d84-9afe-423152764dfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.721215 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f265819-8c24-4d84-9afe-423152764dfb" (UID: "2f265819-8c24-4d84-9afe-423152764dfb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.736030 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz" (OuterVolumeSpecName: "kube-api-access-md2wz") pod "2f265819-8c24-4d84-9afe-423152764dfb" (UID: "2f265819-8c24-4d84-9afe-423152764dfb"). InnerVolumeSpecName "kube-api-access-md2wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.820928 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821020 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821102 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8wsx\" (UniqueName: \"kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821188 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821278 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821297 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md2wz\" (UniqueName: \"kubernetes.io/projected/2f265819-8c24-4d84-9afe-423152764dfb-kube-api-access-md2wz\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.821314 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f265819-8c24-4d84-9afe-423152764dfb-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.861763 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.863091 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.879011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.879341 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f265819-8c24-4d84-9afe-423152764dfb" (UID: "2f265819-8c24-4d84-9afe-423152764dfb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:15:08 crc kubenswrapper[4699]: I0226 11:15:08.900228 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8wsx\" (UniqueName: \"kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx\") pod \"route-controller-manager-7d67f9fbb8-7gsgz\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:08.924255 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f265819-8c24-4d84-9afe-423152764dfb-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.164015 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.164765 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.164807 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.181076 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf" event={"ID":"2f265819-8c24-4d84-9afe-423152764dfb","Type":"ContainerDied","Data":"ad9a383bdf2560f2e8e699488a34354989214933a94da177587d7b8f57d53fa3"}
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.181147 4699 scope.go:117] "RemoveContainer" containerID="438953ae4c2dd5482a447caa172401f76f9355d9f070a2edd3604d4596edc619"
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.181301 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.348308 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"]
Feb 26 11:15:09 crc kubenswrapper[4699]: I0226 11:15:09.352093 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b4784554-77qxf"]
Feb 26 11:15:10 crc kubenswrapper[4699]: I0226 11:15:10.386985 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f265819-8c24-4d84-9afe-423152764dfb" path="/var/lib/kubelet/pods/2f265819-8c24-4d84-9afe-423152764dfb/volumes"
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.584764 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.584911 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:15:11 crc kubenswrapper[4699]: E0226 11:15:11.683835 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Feb 26 11:15:11 crc kubenswrapper[4699]: E0226 11:15:11.684452 4699 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 11:15:11 crc kubenswrapper[4699]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Feb 26 11:15:11 crc kubenswrapper[4699]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qq8lx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535074-bjfld_openshift-infra(30d444da-9127-459c-97c6-cdcff5b20e67): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Feb 26 11:15:11 crc kubenswrapper[4699]: > logger="UnhandledError"
Feb 26 11:15:11 crc kubenswrapper[4699]: E0226 11:15:11.686067 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535074-bjfld" podUID="30d444da-9127-459c-97c6-cdcff5b20e67"
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.696429 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt"
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.929234 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v87n7\" (UniqueName: \"kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7\") pod \"1873d943-4785-4bc5-a9c4-5a027a932464\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") "
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.929704 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert\") pod \"1873d943-4785-4bc5-a9c4-5a027a932464\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") "
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.929764 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config\") pod \"1873d943-4785-4bc5-a9c4-5a027a932464\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") "
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.929861 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca\") pod \"1873d943-4785-4bc5-a9c4-5a027a932464\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") "
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.929921 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles\") pod \"1873d943-4785-4bc5-a9c4-5a027a932464\" (UID: \"1873d943-4785-4bc5-a9c4-5a027a932464\") "
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.931014 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1873d943-4785-4bc5-a9c4-5a027a932464" (UID: "1873d943-4785-4bc5-a9c4-5a027a932464"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.932375 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config" (OuterVolumeSpecName: "config") pod "1873d943-4785-4bc5-a9c4-5a027a932464" (UID: "1873d943-4785-4bc5-a9c4-5a027a932464"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.947830 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1873d943-4785-4bc5-a9c4-5a027a932464" (UID: "1873d943-4785-4bc5-a9c4-5a027a932464"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.948754 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca" (OuterVolumeSpecName: "client-ca") pod "1873d943-4785-4bc5-a9c4-5a027a932464" (UID: "1873d943-4785-4bc5-a9c4-5a027a932464"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:15:11 crc kubenswrapper[4699]: I0226 11:15:11.952378 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7" (OuterVolumeSpecName: "kube-api-access-v87n7") pod "1873d943-4785-4bc5-a9c4-5a027a932464" (UID: "1873d943-4785-4bc5-a9c4-5a027a932464"). InnerVolumeSpecName "kube-api-access-v87n7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:11.970652 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"]
Feb 26 11:15:12 crc kubenswrapper[4699]: E0226 11:15:11.972414 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:11.972452 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:11.974362 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" containerName="controller-manager"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:11.975272 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:11.979026 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"]
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.031367 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.031663 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.031729 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.031858 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.031909 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkb74\" (UniqueName: \"kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.032029 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.032070 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.032092 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v87n7\" (UniqueName: \"kubernetes.io/projected/1873d943-4785-4bc5-a9c4-5a027a932464-kube-api-access-v87n7\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.032105 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1873d943-4785-4bc5-a9c4-5a027a932464-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.032131 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1873d943-4785-4bc5-a9c4-5a027a932464-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.133390 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.133484 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkb74\" (UniqueName: \"kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.133536 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.133627 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.133662 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"
Feb 26 11:15:12 crc kubenswrapper[4699]: I0226
11:15:12.135019 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.135390 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.135647 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.137783 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.159745 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkb74\" (UniqueName: \"kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74\") pod \"controller-manager-7bcd6f597b-s4crp\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " 
pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.331539 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.426561 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" event={"ID":"1873d943-4785-4bc5-a9c4-5a027a932464","Type":"ContainerDied","Data":"8c28a63f20c67fc69e31c70403424ed7ca9920d8e519c0772be0d307ed557c9d"} Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.426604 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c78ff548b-nppmt" Feb 26 11:15:12 crc kubenswrapper[4699]: E0226 11:15:12.432248 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535074-bjfld" podUID="30d444da-9127-459c-97c6-cdcff5b20e67" Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.461469 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:15:12 crc kubenswrapper[4699]: I0226 11:15:12.465534 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c78ff548b-nppmt"] Feb 26 11:15:14 crc kubenswrapper[4699]: I0226 11:15:14.348616 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1873d943-4785-4bc5-a9c4-5a027a932464" path="/var/lib/kubelet/pods/1873d943-4785-4bc5-a9c4-5a027a932464/volumes" Feb 26 11:15:19 crc kubenswrapper[4699]: I0226 11:15:19.055520 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:19 crc kubenswrapper[4699]: I0226 11:15:19.056308 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:19 crc kubenswrapper[4699]: I0226 11:15:19.406189 4699 scope.go:117] "RemoveContainer" containerID="a08711cd4ea22926a99c6ab95e9c9d94bed1d66b8cc11c69332013859eda951b" Feb 26 11:15:20 crc kubenswrapper[4699]: I0226 11:15:20.599808 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"] Feb 26 11:15:20 crc kubenswrapper[4699]: I0226 11:15:20.616968 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 11:15:20 crc kubenswrapper[4699]: I0226 11:15:20.628208 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 11:15:21 crc kubenswrapper[4699]: I0226 11:15:21.427933 4699 ???:1] "http: TLS handshake error from 192.168.126.11:41566: no serving certificate available for the kubelet" Feb 26 11:15:26 crc kubenswrapper[4699]: E0226 11:15:26.072197 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 11:15:26 crc kubenswrapper[4699]: E0226 11:15:26.073169 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9z6wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mzgjj_openshift-marketplace(71a83978-4f86-404b-967a-0e7493ff6721): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:26 crc kubenswrapper[4699]: E0226 11:15:26.074722 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mzgjj" podUID="71a83978-4f86-404b-967a-0e7493ff6721" Feb 26 11:15:27 crc kubenswrapper[4699]: E0226 11:15:27.633844 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 11:15:27 crc kubenswrapper[4699]: E0226 11:15:27.634281 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hnhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fhgnz_openshift-marketplace(1389c8c4-9546-4193-8067-50db90448d4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:27 crc kubenswrapper[4699]: E0226 11:15:27.635448 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fhgnz" podUID="1389c8c4-9546-4193-8067-50db90448d4f" Feb 26 11:15:28 crc kubenswrapper[4699]: I0226 11:15:28.548752 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"] Feb 26 11:15:29 crc kubenswrapper[4699]: I0226 11:15:29.056404 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:29 crc kubenswrapper[4699]: I0226 11:15:29.056502 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:39 crc kubenswrapper[4699]: I0226 11:15:39.056205 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: 
connect: connection refused" start-of-body= Feb 26 11:15:39 crc kubenswrapper[4699]: I0226 11:15:39.056750 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:41 crc kubenswrapper[4699]: I0226 11:15:41.585471 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:15:41 crc kubenswrapper[4699]: I0226 11:15:41.586057 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:15:41 crc kubenswrapper[4699]: I0226 11:15:41.586207 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:15:41 crc kubenswrapper[4699]: I0226 11:15:41.588220 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:15:41 crc kubenswrapper[4699]: I0226 11:15:41.588431 4699 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4" gracePeriod=600 Feb 26 11:15:44 crc kubenswrapper[4699]: I0226 11:15:44.209676 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4" exitCode=0 Feb 26 11:15:44 crc kubenswrapper[4699]: I0226 11:15:44.209794 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4"} Feb 26 11:15:44 crc kubenswrapper[4699]: E0226 11:15:44.368619 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 11:15:44 crc kubenswrapper[4699]: E0226 11:15:44.368815 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tqhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-sc9c6_openshift-marketplace(44d171ad-7d92-4c70-a686-65f60ded8a03): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:44 crc kubenswrapper[4699]: E0226 11:15:44.369998 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-sc9c6" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" Feb 26 11:15:47 crc 
kubenswrapper[4699]: E0226 11:15:47.092413 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-sc9c6" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" Feb 26 11:15:47 crc kubenswrapper[4699]: I0226 11:15:47.233473 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a904aa73-23d7-4994-882a-4afafe02fb82","Type":"ContainerStarted","Data":"c6277d115f7a8e7d06e98e6fbf746a8f5f67a2bf9660b521fc6a925c224a7f1a"} Feb 26 11:15:47 crc kubenswrapper[4699]: I0226 11:15:47.236019 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4" event={"ID":"ed8aec36-74ad-4c69-baf8-d672010495e9","Type":"ContainerStarted","Data":"ef7f4740e98b0b8517cf802c045aedd86853560276f43770af8b78d775aa6c30"} Feb 26 11:15:47 crc kubenswrapper[4699]: I0226 11:15:47.236865 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e999a971-660e-4244-8ff3-5d41795bd7f1","Type":"ContainerStarted","Data":"8647ab112fe5d72e6317d33357b8faf5f04c7e9ece66676a3eb1dd1a578be5e7"} Feb 26 11:15:47 crc kubenswrapper[4699]: I0226 11:15:47.299433 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"] Feb 26 11:15:47 crc kubenswrapper[4699]: W0226 11:15:47.316668 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7738e1_5c72_401d_ba71_9ae3b1d9d266.slice/crio-5afbad926c55c3da8aaf47419aa634cd696fc133f69e9ec92350e49c1022647d WatchSource:0}: Error finding container 5afbad926c55c3da8aaf47419aa634cd696fc133f69e9ec92350e49c1022647d: Status 404 returned error 
can't find the container with id 5afbad926c55c3da8aaf47419aa634cd696fc133f69e9ec92350e49c1022647d Feb 26 11:15:47 crc kubenswrapper[4699]: I0226 11:15:47.393730 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"] Feb 26 11:15:48 crc kubenswrapper[4699]: I0226 11:15:48.242855 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" event={"ID":"fb7738e1-5c72-401d-ba71-9ae3b1d9d266","Type":"ContainerStarted","Data":"5afbad926c55c3da8aaf47419aa634cd696fc133f69e9ec92350e49c1022647d"} Feb 26 11:15:48 crc kubenswrapper[4699]: I0226 11:15:48.243686 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" event={"ID":"eb4d1ddf-c814-4b93-972e-bffff61f9170","Type":"ContainerStarted","Data":"2891ab62ab9758bb7b4c92c3675a2020c320b414be74c46ec576f65a5d1c42f1"} Feb 26 11:15:48 crc kubenswrapper[4699]: I0226 11:15:48.688064 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"] Feb 26 11:15:48 crc kubenswrapper[4699]: I0226 11:15:48.789065 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"] Feb 26 11:15:49 crc kubenswrapper[4699]: I0226 11:15:49.055513 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:49 crc kubenswrapper[4699]: I0226 11:15:49.055577 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.257332 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" event={"ID":"eb4d1ddf-c814-4b93-972e-bffff61f9170","Type":"ContainerStarted","Data":"71dd1fba88e532cb5564144c451ba1f550ea9fc371b535e240792729f2f00e75"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.258987 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a904aa73-23d7-4994-882a-4afafe02fb82","Type":"ContainerStarted","Data":"3fd7c462ca5ff5bad3835e30c617b483507570293909f8328947b3fbdead2389"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.265892 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" event={"ID":"fb7738e1-5c72-401d-ba71-9ae3b1d9d266","Type":"ContainerStarted","Data":"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.266014 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerName="route-controller-manager" containerID="cri-o://42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c" gracePeriod=30 Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.266712 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.278383 4699 generic.go:334] "Generic (PLEG): container finished" podID="ed8aec36-74ad-4c69-baf8-d672010495e9" containerID="1a649c81866f7635a569ca368b86ef4aadb641a91575dd77e87694a700822950" exitCode=0 Feb 26 11:15:50 crc 
kubenswrapper[4699]: I0226 11:15:50.278447 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4" event={"ID":"ed8aec36-74ad-4c69-baf8-d672010495e9","Type":"ContainerDied","Data":"1a649c81866f7635a569ca368b86ef4aadb641a91575dd77e87694a700822950"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.279636 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e999a971-660e-4244-8ff3-5d41795bd7f1","Type":"ContainerStarted","Data":"043f1a99306eeb89179e5d095bad024d7a81e0a392209e6ed07047c4d32579cd"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.281283 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tcnxt" event={"ID":"72b1bc55-f48b-4d90-ab02-3a80438096b6","Type":"ContainerStarted","Data":"c4a4d68b91ccaab01354dac36137a42576c82cee0d31b4d795e6c8dc0cda8b68"} Feb 26 11:15:50 crc kubenswrapper[4699]: I0226 11:15:50.331830 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" podStartSLOduration=62.331806418 podStartE2EDuration="1m2.331806418s" podCreationTimestamp="2026-02-26 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:15:50.307939426 +0000 UTC m=+296.118765880" watchObservedRunningTime="2026-02-26 11:15:50.331806418 +0000 UTC m=+296.142632862" Feb 26 11:15:50 crc kubenswrapper[4699]: E0226 11:15:50.360486 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 11:15:50 crc kubenswrapper[4699]: E0226 11:15:50.360690 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rr725,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-s8kpz_openshift-marketplace(8c96a703-e568-4916-8035-a951ae91dc2b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:50 crc kubenswrapper[4699]: E0226 11:15:50.362744 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-s8kpz" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.080880 4699 patch_prober.go:28] interesting pod/route-controller-manager-7d67f9fbb8-7gsgz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:58574->10.217.0.61:8443: read: connection reset by peer" start-of-body= Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.081281 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:58574->10.217.0.61:8443: read: connection reset by peer" Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.289170 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e"} Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.289438 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerName="controller-manager" containerID="cri-o://71dd1fba88e532cb5564144c451ba1f550ea9fc371b535e240792729f2f00e75" gracePeriod=30 Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.290440 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.290481 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.293085 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-s8kpz" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.322153 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" podStartSLOduration=63.322135377 podStartE2EDuration="1m3.322135377s" podCreationTimestamp="2026-02-26 11:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:15:51.320504179 +0000 UTC m=+297.131330633" watchObservedRunningTime="2026-02-26 11:15:51.322135377 +0000 UTC m=+297.132961811" Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.339185 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=61.339161061 podStartE2EDuration="1m1.339161061s" podCreationTimestamp="2026-02-26 11:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:15:51.335486734 +0000 UTC m=+297.146313198" watchObservedRunningTime="2026-02-26 
11:15:51.339161061 +0000 UTC m=+297.149987505" Feb 26 11:15:51 crc kubenswrapper[4699]: I0226 11:15:51.379359 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=66.379336497 podStartE2EDuration="1m6.379336497s" podCreationTimestamp="2026-02-26 11:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:15:51.377146923 +0000 UTC m=+297.187973357" watchObservedRunningTime="2026-02-26 11:15:51.379336497 +0000 UTC m=+297.190162941" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.714766 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.715296 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44jnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jhgks_openshift-marketplace(6b9da973-6b5f-4485-adca-8792b0a3d256): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.716547 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jhgks" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" Feb 26 11:15:51 crc 
kubenswrapper[4699]: E0226 11:15:51.941213 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.941407 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-699tw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-phhbz_openshift-marketplace(9ea10063-7888-400e-af1c-216cbde5a13e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.943632 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-phhbz" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.987563 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.987760 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xhdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hrk4n_openshift-marketplace(6e7ddf51-5522-4085-8567-76c9a254ed15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.988990 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hrk4n" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" Feb 26 11:15:51 crc 
kubenswrapper[4699]: E0226 11:15:51.997647 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.997788 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqrqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-czwkc_openshift-marketplace(ac0026c3-1fad-4b34-9c42-389971f0c773): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:15:51 crc kubenswrapper[4699]: E0226 11:15:51.999001 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-czwkc" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.139148 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7d67f9fbb8-7gsgz_fb7738e1-5c72-401d-ba71-9ae3b1d9d266/route-controller-manager/0.log" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.139213 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.153667 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.172525 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8wsx\" (UniqueName: \"kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx\") pod \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.172580 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config\") pod \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.172725 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert\") pod \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.172770 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca\") pod \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\" (UID: \"fb7738e1-5c72-401d-ba71-9ae3b1d9d266\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.174265 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb7738e1-5c72-401d-ba71-9ae3b1d9d266" (UID: "fb7738e1-5c72-401d-ba71-9ae3b1d9d266"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.174934 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config" (OuterVolumeSpecName: "config") pod "fb7738e1-5c72-401d-ba71-9ae3b1d9d266" (UID: "fb7738e1-5c72-401d-ba71-9ae3b1d9d266"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.182019 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx" (OuterVolumeSpecName: "kube-api-access-z8wsx") pod "fb7738e1-5c72-401d-ba71-9ae3b1d9d266" (UID: "fb7738e1-5c72-401d-ba71-9ae3b1d9d266"). InnerVolumeSpecName "kube-api-access-z8wsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.183781 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.184131 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8aec36-74ad-4c69-baf8-d672010495e9" containerName="collect-profiles" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.184149 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8aec36-74ad-4c69-baf8-d672010495e9" containerName="collect-profiles" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.184164 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerName="route-controller-manager" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.184175 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerName="route-controller-manager" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 
11:15:52.184352 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8aec36-74ad-4c69-baf8-d672010495e9" containerName="collect-profiles" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.184378 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerName="route-controller-manager" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.184985 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.185994 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb7738e1-5c72-401d-ba71-9ae3b1d9d266" (UID: "fb7738e1-5c72-401d-ba71-9ae3b1d9d266"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.197142 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274439 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume\") pod \"ed8aec36-74ad-4c69-baf8-d672010495e9\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274525 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kmjp\" (UniqueName: \"kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp\") pod \"ed8aec36-74ad-4c69-baf8-d672010495e9\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 
11:15:52.274582 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume\") pod \"ed8aec36-74ad-4c69-baf8-d672010495e9\" (UID: \"ed8aec36-74ad-4c69-baf8-d672010495e9\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274807 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vbq\" (UniqueName: \"kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274890 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274929 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: 
\"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274974 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.274992 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.275002 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8wsx\" (UniqueName: \"kubernetes.io/projected/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-kube-api-access-z8wsx\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.275014 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7738e1-5c72-401d-ba71-9ae3b1d9d266-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.276019 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed8aec36-74ad-4c69-baf8-d672010495e9" (UID: "ed8aec36-74ad-4c69-baf8-d672010495e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.278975 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed8aec36-74ad-4c69-baf8-d672010495e9" (UID: "ed8aec36-74ad-4c69-baf8-d672010495e9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.279200 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp" (OuterVolumeSpecName: "kube-api-access-7kmjp") pod "ed8aec36-74ad-4c69-baf8-d672010495e9" (UID: "ed8aec36-74ad-4c69-baf8-d672010495e9"). InnerVolumeSpecName "kube-api-access-7kmjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.295405 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4" event={"ID":"ed8aec36-74ad-4c69-baf8-d672010495e9","Type":"ContainerDied","Data":"ef7f4740e98b0b8517cf802c045aedd86853560276f43770af8b78d775aa6c30"} Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.295452 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef7f4740e98b0b8517cf802c045aedd86853560276f43770af8b78d775aa6c30" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.295491 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.297282 4699 generic.go:334] "Generic (PLEG): container finished" podID="e999a971-660e-4244-8ff3-5d41795bd7f1" containerID="043f1a99306eeb89179e5d095bad024d7a81e0a392209e6ed07047c4d32579cd" exitCode=0 Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.297389 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e999a971-660e-4244-8ff3-5d41795bd7f1","Type":"ContainerDied","Data":"043f1a99306eeb89179e5d095bad024d7a81e0a392209e6ed07047c4d32579cd"} Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.303632 4699 generic.go:334] "Generic (PLEG): container finished" podID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerID="71dd1fba88e532cb5564144c451ba1f550ea9fc371b535e240792729f2f00e75" exitCode=0 Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.303745 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" event={"ID":"eb4d1ddf-c814-4b93-972e-bffff61f9170","Type":"ContainerDied","Data":"71dd1fba88e532cb5564144c451ba1f550ea9fc371b535e240792729f2f00e75"} Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305277 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7d67f9fbb8-7gsgz_fb7738e1-5c72-401d-ba71-9ae3b1d9d266/route-controller-manager/0.log" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305333 4699 generic.go:334] "Generic (PLEG): container finished" podID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" containerID="42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c" exitCode=255 Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" event={"ID":"fb7738e1-5c72-401d-ba71-9ae3b1d9d266","Type":"ContainerDied","Data":"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c"} Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305409 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz" event={"ID":"fb7738e1-5c72-401d-ba71-9ae3b1d9d266","Type":"ContainerDied","Data":"5afbad926c55c3da8aaf47419aa634cd696fc133f69e9ec92350e49c1022647d"} Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.305443 4699 scope.go:117] "RemoveContainer" containerID="42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.333725 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.343273 4699 patch_prober.go:28] interesting pod/controller-manager-7bcd6f597b-s4crp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.343360 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 
11:15:52.376830 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.376990 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vbq\" (UniqueName: \"kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.377060 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.377305 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.377388 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kmjp\" (UniqueName: \"kubernetes.io/projected/ed8aec36-74ad-4c69-baf8-d672010495e9-kube-api-access-7kmjp\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc 
kubenswrapper[4699]: I0226 11:15:52.377403 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8aec36-74ad-4c69-baf8-d672010495e9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.377416 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8aec36-74ad-4c69-baf8-d672010495e9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.382428 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.383758 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.386321 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.399710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vbq\" (UniqueName: 
\"kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq\") pod \"route-controller-manager-5dd48cdbf5-ckczt\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.411200 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"] Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.414779 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d67f9fbb8-7gsgz"] Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.518919 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.580010 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-phhbz" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.580070 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-czwkc" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.580196 4699 scope.go:117] "RemoveContainer" containerID="42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.580282 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hrk4n" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.580408 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jhgks" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" Feb 26 11:15:52 crc kubenswrapper[4699]: E0226 11:15:52.580568 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c\": container with ID starting with 42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c not found: ID does not exist" containerID="42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.580686 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c"} err="failed to get container status \"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c\": rpc error: code = NotFound desc = could not find container \"42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c\": container with ID starting with 42f38f097b8ede853f7ded75c70a15cea1617b056009499cf9728bf4be448b9c not found: ID does not exist" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.591585 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.681070 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkb74\" (UniqueName: \"kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74\") pod \"eb4d1ddf-c814-4b93-972e-bffff61f9170\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.681747 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles\") pod \"eb4d1ddf-c814-4b93-972e-bffff61f9170\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.681832 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert\") pod \"eb4d1ddf-c814-4b93-972e-bffff61f9170\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.681886 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config\") pod \"eb4d1ddf-c814-4b93-972e-bffff61f9170\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.681919 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca\") pod \"eb4d1ddf-c814-4b93-972e-bffff61f9170\" (UID: \"eb4d1ddf-c814-4b93-972e-bffff61f9170\") " Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.683335 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb4d1ddf-c814-4b93-972e-bffff61f9170" (UID: "eb4d1ddf-c814-4b93-972e-bffff61f9170"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.684626 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config" (OuterVolumeSpecName: "config") pod "eb4d1ddf-c814-4b93-972e-bffff61f9170" (UID: "eb4d1ddf-c814-4b93-972e-bffff61f9170"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.685012 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb4d1ddf-c814-4b93-972e-bffff61f9170" (UID: "eb4d1ddf-c814-4b93-972e-bffff61f9170"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.688227 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74" (OuterVolumeSpecName: "kube-api-access-lkb74") pod "eb4d1ddf-c814-4b93-972e-bffff61f9170" (UID: "eb4d1ddf-c814-4b93-972e-bffff61f9170"). InnerVolumeSpecName "kube-api-access-lkb74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.689081 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb4d1ddf-c814-4b93-972e-bffff61f9170" (UID: "eb4d1ddf-c814-4b93-972e-bffff61f9170"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.783528 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.783823 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkb74\" (UniqueName: \"kubernetes.io/projected/eb4d1ddf-c814-4b93-972e-bffff61f9170-kube-api-access-lkb74\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.783837 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.783848 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4d1ddf-c814-4b93-972e-bffff61f9170-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:52 crc kubenswrapper[4699]: I0226 11:15:52.783858 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb4d1ddf-c814-4b93-972e-bffff61f9170-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.002568 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:15:53 crc kubenswrapper[4699]: W0226 11:15:53.009309 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf1b992_7a07_4490_bf91_a0e2f802d6aa.slice/crio-fc9c825381387b5b1c159e4f7a59a6327fb0accf80fe9e5362f7a82129337743 WatchSource:0}: Error finding container fc9c825381387b5b1c159e4f7a59a6327fb0accf80fe9e5362f7a82129337743: 
Status 404 returned error can't find the container with id fc9c825381387b5b1c159e4f7a59a6327fb0accf80fe9e5362f7a82129337743 Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.314612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" event={"ID":"eb4d1ddf-c814-4b93-972e-bffff61f9170","Type":"ContainerDied","Data":"2891ab62ab9758bb7b4c92c3675a2020c320b414be74c46ec576f65a5d1c42f1"} Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.314675 4699 scope.go:117] "RemoveContainer" containerID="71dd1fba88e532cb5564144c451ba1f550ea9fc371b535e240792729f2f00e75" Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.314660 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcd6f597b-s4crp" Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.318139 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535074-bjfld" event={"ID":"30d444da-9127-459c-97c6-cdcff5b20e67","Type":"ContainerStarted","Data":"19a60f72e3a64feb9f04d813b42f9a20a08e1ed258c497a9b61b68ad603f4b5b"} Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.320024 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" event={"ID":"bcf1b992-7a07-4490-bf91-a0e2f802d6aa","Type":"ContainerStarted","Data":"3bdc2b92ccb0248bc419987c54d581fc1cccfbe419459f4758200553b00ea095"} Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.320062 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" event={"ID":"bcf1b992-7a07-4490-bf91-a0e2f802d6aa","Type":"ContainerStarted","Data":"fc9c825381387b5b1c159e4f7a59a6327fb0accf80fe9e5362f7a82129337743"} Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.352952 4699 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535074-bjfld" podStartSLOduration=2.567809177 podStartE2EDuration="1m53.352931508s" podCreationTimestamp="2026-02-26 11:14:00 +0000 UTC" firstStartedPulling="2026-02-26 11:14:01.806605069 +0000 UTC m=+187.617431503" lastFinishedPulling="2026-02-26 11:15:52.59172741 +0000 UTC m=+298.402553834" observedRunningTime="2026-02-26 11:15:53.339418156 +0000 UTC m=+299.150244600" watchObservedRunningTime="2026-02-26 11:15:53.352931508 +0000 UTC m=+299.163757952" Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.355840 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"] Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.359908 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bcd6f597b-s4crp"] Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.512315 4699 csr.go:261] certificate signing request csr-299v4 is approved, waiting to be issued Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.520608 4699 csr.go:257] certificate signing request csr-299v4 is issued Feb 26 11:15:53 crc kubenswrapper[4699]: I0226 11:15:53.591322 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" containerID="cri-o://74570cc7e5f47cfb5ae78c7040168924d22c48d5892dd1918f787cd4639c996c" gracePeriod=15 Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.268849 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" path="/var/lib/kubelet/pods/eb4d1ddf-c814-4b93-972e-bffff61f9170/volumes" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.269460 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7738e1-5c72-401d-ba71-9ae3b1d9d266" 
path="/var/lib/kubelet/pods/fb7738e1-5c72-401d-ba71-9ae3b1d9d266/volumes" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.328656 4699 generic.go:334] "Generic (PLEG): container finished" podID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerID="74570cc7e5f47cfb5ae78c7040168924d22c48d5892dd1918f787cd4639c996c" exitCode=0 Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.328773 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" event={"ID":"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466","Type":"ContainerDied","Data":"74570cc7e5f47cfb5ae78c7040168924d22c48d5892dd1918f787cd4639c996c"} Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.331507 4699 generic.go:334] "Generic (PLEG): container finished" podID="30d444da-9127-459c-97c6-cdcff5b20e67" containerID="19a60f72e3a64feb9f04d813b42f9a20a08e1ed258c497a9b61b68ad603f4b5b" exitCode=0 Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.331592 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535074-bjfld" event={"ID":"30d444da-9127-459c-97c6-cdcff5b20e67","Type":"ContainerDied","Data":"19a60f72e3a64feb9f04d813b42f9a20a08e1ed258c497a9b61b68ad603f4b5b"} Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.333226 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.338017 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.353611 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" podStartSLOduration=6.353592195 podStartE2EDuration="6.353592195s" podCreationTimestamp="2026-02-26 
11:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:15:54.353361779 +0000 UTC m=+300.164188233" watchObservedRunningTime="2026-02-26 11:15:54.353592195 +0000 UTC m=+300.164418629" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.522417 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-23 12:46:37.430995219 +0000 UTC Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.522472 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7201h30m42.908527273s for next certificate rotation Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.611359 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:15:54 crc kubenswrapper[4699]: E0226 11:15:54.611616 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerName="controller-manager" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.611630 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerName="controller-manager" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.611772 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4d1ddf-c814-4b93-972e-bffff61f9170" containerName="controller-manager" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.612208 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.615686 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.615885 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.616078 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.616345 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.616576 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.616721 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.624791 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.625170 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.711236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " 
pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.711354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.711404 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9tf\" (UniqueName: \"kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.711447 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.711473 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.813442 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.814006 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.814057 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9tf\" (UniqueName: \"kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.814102 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.814139 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 
11:15:54.814839 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.815681 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.816054 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.822048 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.831523 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9tf\" (UniqueName: \"kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf\") pod \"controller-manager-75fbbf96d5-6r6lk\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " 
pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:54 crc kubenswrapper[4699]: I0226 11:15:54.966543 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:15:55 crc kubenswrapper[4699]: E0226 11:15:55.282655 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:15:55 crc kubenswrapper[4699]: E0226 11:15:55.298453 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:15:55 crc kubenswrapper[4699]: E0226 11:15:55.309424 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:15:55 crc kubenswrapper[4699]: I0226 11:15:55.522938 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 03:12:07.136687041 +0000 UTC Feb 26 11:15:55 crc kubenswrapper[4699]: I0226 11:15:55.523000 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6951h56m11.613690571s for next certificate rotation Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.056170 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.056411 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.057181 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.056884 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.057264 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.057905 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.057999 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.156739 4699 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-22qbz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.156813 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.208690 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.289439 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir\") pod \"e999a971-660e-4244-8ff3-5d41795bd7f1\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.289572 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access\") pod \"e999a971-660e-4244-8ff3-5d41795bd7f1\" (UID: \"e999a971-660e-4244-8ff3-5d41795bd7f1\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.289566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e999a971-660e-4244-8ff3-5d41795bd7f1" (UID: "e999a971-660e-4244-8ff3-5d41795bd7f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.289982 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e999a971-660e-4244-8ff3-5d41795bd7f1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.296026 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e999a971-660e-4244-8ff3-5d41795bd7f1" (UID: "e999a971-660e-4244-8ff3-5d41795bd7f1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.356353 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.366567 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.366637 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e999a971-660e-4244-8ff3-5d41795bd7f1","Type":"ContainerDied","Data":"8647ab112fe5d72e6317d33357b8faf5f04c7e9ece66676a3eb1dd1a578be5e7"} Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.366834 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8647ab112fe5d72e6317d33357b8faf5f04c7e9ece66676a3eb1dd1a578be5e7" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.369915 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535074-bjfld" event={"ID":"30d444da-9127-459c-97c6-cdcff5b20e67","Type":"ContainerDied","Data":"18076c1c5e0cfb7ca48ea66321abbc8359663b222708aa29c8481673d9c4ff5c"} Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.369972 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18076c1c5e0cfb7ca48ea66321abbc8359663b222708aa29c8481673d9c4ff5c" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.370016 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535074-bjfld" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.390691 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq8lx\" (UniqueName: \"kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx\") pod \"30d444da-9127-459c-97c6-cdcff5b20e67\" (UID: \"30d444da-9127-459c-97c6-cdcff5b20e67\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.391127 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e999a971-660e-4244-8ff3-5d41795bd7f1-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.396085 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx" (OuterVolumeSpecName: "kube-api-access-qq8lx") pod "30d444da-9127-459c-97c6-cdcff5b20e67" (UID: "30d444da-9127-459c-97c6-cdcff5b20e67"). InnerVolumeSpecName "kube-api-access-qq8lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.492367 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq8lx\" (UniqueName: \"kubernetes.io/projected/30d444da-9127-459c-97c6-cdcff5b20e67-kube-api-access-qq8lx\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.682091 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.746393 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:15:59 crc kubenswrapper[4699]: W0226 11:15:59.750901 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d17836b_1dda_4b03_8417_7025a21b7f0f.slice/crio-49b49b629531d2d3d10e7e6211aa14269de0c64c913e3b2b4346368fd08d5a2b WatchSource:0}: Error finding container 49b49b629531d2d3d10e7e6211aa14269de0c64c913e3b2b4346368fd08d5a2b: Status 404 returned error can't find the container with id 49b49b629531d2d3d10e7e6211aa14269de0c64c913e3b2b4346368fd08d5a2b Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796581 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796629 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796674 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796701 
4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796766 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796799 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796827 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796961 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.796993 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797029 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797056 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 
crc kubenswrapper[4699]: I0226 11:15:59.797093 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797180 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797252 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7knj\" (UniqueName: \"kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj\") pod \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\" (UID: \"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466\") " Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797691 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.797830 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.798182 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.798313 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.799019 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.799042 4699 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.802408 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.802623 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.803004 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.803340 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.803408 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.803431 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj" (OuterVolumeSpecName: "kube-api-access-q7knj") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "kube-api-access-q7knj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.803536 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.804018 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.805659 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" (UID: "94f9b9d1-e4e9-4cd5-8606-80f57ee5c466"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900048 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7knj\" (UniqueName: \"kubernetes.io/projected/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-kube-api-access-q7knj\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900098 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900127 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900142 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900155 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900170 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900181 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900193 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900205 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900217 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900229 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 11:15:59 crc kubenswrapper[4699]: I0226 11:15:59.900242 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137364 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535076-rv9x5"] Feb 26 11:16:00 crc kubenswrapper[4699]: E0226 11:16:00.137697 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e999a971-660e-4244-8ff3-5d41795bd7f1" containerName="pruner" Feb 
26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137713 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e999a971-660e-4244-8ff3-5d41795bd7f1" containerName="pruner" Feb 26 11:16:00 crc kubenswrapper[4699]: E0226 11:16:00.137725 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137734 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" Feb 26 11:16:00 crc kubenswrapper[4699]: E0226 11:16:00.137761 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d444da-9127-459c-97c6-cdcff5b20e67" containerName="oc" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137771 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d444da-9127-459c-97c6-cdcff5b20e67" containerName="oc" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137905 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" containerName="oauth-openshift" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137940 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e999a971-660e-4244-8ff3-5d41795bd7f1" containerName="pruner" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.137950 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d444da-9127-459c-97c6-cdcff5b20e67" containerName="oc" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.138510 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.141299 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.141552 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.142945 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.148903 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535076-rv9x5"] Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.204162 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knl5z\" (UniqueName: \"kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z\") pod \"auto-csr-approver-29535076-rv9x5\" (UID: \"0d9d78c8-4193-47a8-9ed9-208f6dc25831\") " pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.305613 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knl5z\" (UniqueName: \"kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z\") pod \"auto-csr-approver-29535076-rv9x5\" (UID: \"0d9d78c8-4193-47a8-9ed9-208f6dc25831\") " pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.327442 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knl5z\" (UniqueName: \"kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z\") pod \"auto-csr-approver-29535076-rv9x5\" (UID: \"0d9d78c8-4193-47a8-9ed9-208f6dc25831\") " 
pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.378319 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" event={"ID":"0d17836b-1dda-4b03-8417-7025a21b7f0f","Type":"ContainerStarted","Data":"24a7d1423c70d2629cc84a6d309405af35280dc1c8f65cb9cbedd726c5936739"} Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.378375 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" event={"ID":"0d17836b-1dda-4b03-8417-7025a21b7f0f","Type":"ContainerStarted","Data":"49b49b629531d2d3d10e7e6211aa14269de0c64c913e3b2b4346368fd08d5a2b"} Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.379918 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" event={"ID":"94f9b9d1-e4e9-4cd5-8606-80f57ee5c466","Type":"ContainerDied","Data":"9d7ac90385fbaeacd88791e44cd5f3dbc802f7727daac69d69660d2d1079d013"} Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.379962 4699 scope.go:117] "RemoveContainer" containerID="74570cc7e5f47cfb5ae78c7040168924d22c48d5892dd1918f787cd4639c996c" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.380145 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-22qbz" Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.411355 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"] Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.416975 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-22qbz"] Feb 26 11:16:00 crc kubenswrapper[4699]: I0226 11:16:00.497522 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.023108 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535076-rv9x5"] Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.386124 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" event={"ID":"0d9d78c8-4193-47a8-9ed9-208f6dc25831","Type":"ContainerStarted","Data":"d9a56a2a86268af382b046874040445b48c2975a953e5d204ab0a77f6c325fdc"} Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.391176 4699 generic.go:334] "Generic (PLEG): container finished" podID="1389c8c4-9546-4193-8067-50db90448d4f" containerID="576debb0d3d58f5281816cda92fedce6f78492ddc1301cf006959585594f82b9" exitCode=0 Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.391245 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerDied","Data":"576debb0d3d58f5281816cda92fedce6f78492ddc1301cf006959585594f82b9"} Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.395929 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerStarted","Data":"f41fa5d8badc750f1371bec0896b93547f2bd25c6f1942a17a10cfb9c1edba94"} Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.395970 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.406928 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:16:01 crc kubenswrapper[4699]: I0226 11:16:01.469932 4699 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" podStartSLOduration=13.469915361 podStartE2EDuration="13.469915361s" podCreationTimestamp="2026-02-26 11:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:16:01.46470423 +0000 UTC m=+307.275530664" watchObservedRunningTime="2026-02-26 11:16:01.469915361 +0000 UTC m=+307.280741795" Feb 26 11:16:02 crc kubenswrapper[4699]: I0226 11:16:02.268646 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f9b9d1-e4e9-4cd5-8606-80f57ee5c466" path="/var/lib/kubelet/pods/94f9b9d1-e4e9-4cd5-8606-80f57ee5c466/volumes" Feb 26 11:16:02 crc kubenswrapper[4699]: I0226 11:16:02.402225 4699 generic.go:334] "Generic (PLEG): container finished" podID="71a83978-4f86-404b-967a-0e7493ff6721" containerID="f41fa5d8badc750f1371bec0896b93547f2bd25c6f1942a17a10cfb9c1edba94" exitCode=0 Feb 26 11:16:02 crc kubenswrapper[4699]: I0226 11:16:02.402333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerDied","Data":"f41fa5d8badc750f1371bec0896b93547f2bd25c6f1942a17a10cfb9c1edba94"} Feb 26 11:16:06 crc kubenswrapper[4699]: I0226 11:16:06.260694 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:06 crc kubenswrapper[4699]: I0226 11:16:06.437525 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerStarted","Data":"b8eedef066fa8aaa6df130360e7ae91b6c35c65386cf0e1eb331ae24b87e6305"} Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.464993 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhgnz" podStartSLOduration=6.040778383 podStartE2EDuration="2m0.464974119s" podCreationTimestamp="2026-02-26 11:14:07 +0000 UTC" firstStartedPulling="2026-02-26 11:14:11.064554627 +0000 UTC m=+196.875381061" lastFinishedPulling="2026-02-26 11:16:05.488750363 +0000 UTC m=+311.299576797" observedRunningTime="2026-02-26 11:16:07.462864418 +0000 UTC m=+313.273690872" watchObservedRunningTime="2026-02-26 11:16:07.464974119 +0000 UTC m=+313.275800573" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.618452 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-bbg8s"] Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.619498 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.625432 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.625596 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.625674 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.625812 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627192 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627474 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627522 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627572 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627614 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.627856 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 
11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.628103 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.628226 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.634345 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.639178 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.641936 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-bbg8s"] Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.692933 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796134 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjbv\" (UniqueName: \"kubernetes.io/projected/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-kube-api-access-zqjbv\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796267 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " 
pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796362 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796393 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796418 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796508 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796618 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796683 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796721 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796774 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796803 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796844 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-policies\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796875 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.796930 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-dir\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.897980 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " 
pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898046 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898067 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898085 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898135 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898162 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.898846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899592 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899680 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: 
\"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899688 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899732 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899763 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-policies\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899902 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.899986 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-dir\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.900062 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjbv\" (UniqueName: \"kubernetes.io/projected/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-kube-api-access-zqjbv\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.900508 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-policies\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.900570 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-audit-dir\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.900976 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.904904 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.905495 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.905549 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.906334 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.906541 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.908562 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.919672 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.919913 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.921266 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjbv\" (UniqueName: \"kubernetes.io/projected/c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a-kube-api-access-zqjbv\") pod \"oauth-openshift-f54c45747-bbg8s\" (UID: \"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a\") " pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:07 crc kubenswrapper[4699]: I0226 11:16:07.938219 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.117492 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.117555 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.798394 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.798955 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerName="controller-manager" containerID="cri-o://24a7d1423c70d2629cc84a6d309405af35280dc1c8f65cb9cbedd726c5936739" gracePeriod=30 Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.829176 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:16:08 crc kubenswrapper[4699]: I0226 11:16:08.829765 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerName="route-controller-manager" containerID="cri-o://3bdc2b92ccb0248bc419987c54d581fc1cccfbe419459f4758200553b00ea095" gracePeriod=30 Feb 26 11:16:09 crc kubenswrapper[4699]: I0226 11:16:09.056180 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 
11:16:09 crc kubenswrapper[4699]: I0226 11:16:09.056245 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:16:09 crc kubenswrapper[4699]: I0226 11:16:09.056183 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-tcnxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 26 11:16:09 crc kubenswrapper[4699]: I0226 11:16:09.056527 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tcnxt" podUID="72b1bc55-f48b-4d90-ab02-3a80438096b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 26 11:16:09 crc kubenswrapper[4699]: I0226 11:16:09.260554 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:10 crc kubenswrapper[4699]: I0226 11:16:10.260347 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:10 crc kubenswrapper[4699]: I0226 11:16:10.469664 4699 generic.go:334] "Generic (PLEG): container finished" podID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerID="3bdc2b92ccb0248bc419987c54d581fc1cccfbe419459f4758200553b00ea095" exitCode=0 Feb 26 11:16:10 crc kubenswrapper[4699]: I0226 11:16:10.469748 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" event={"ID":"bcf1b992-7a07-4490-bf91-a0e2f802d6aa","Type":"ContainerDied","Data":"3bdc2b92ccb0248bc419987c54d581fc1cccfbe419459f4758200553b00ea095"} Feb 26 11:16:10 crc kubenswrapper[4699]: I0226 11:16:10.470803 4699 generic.go:334] "Generic (PLEG): container finished" podID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerID="24a7d1423c70d2629cc84a6d309405af35280dc1c8f65cb9cbedd726c5936739" exitCode=0 Feb 26 11:16:10 crc kubenswrapper[4699]: I0226 11:16:10.470829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" event={"ID":"0d17836b-1dda-4b03-8417-7025a21b7f0f","Type":"ContainerDied","Data":"24a7d1423c70d2629cc84a6d309405af35280dc1c8f65cb9cbedd726c5936739"} Feb 26 11:16:11 crc kubenswrapper[4699]: I0226 11:16:11.133197 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fhgnz" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="registry-server" probeResult="failure" output=< Feb 26 11:16:11 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Feb 26 11:16:11 crc kubenswrapper[4699]: > Feb 26 11:16:12 crc kubenswrapper[4699]: I0226 11:16:12.520661 4699 patch_prober.go:28] interesting pod/route-controller-manager-5dd48cdbf5-ckczt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Feb 26 11:16:12 crc kubenswrapper[4699]: I0226 11:16:12.520751 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Feb 26 11:16:14 crc kubenswrapper[4699]: I0226 11:16:14.968046 4699 patch_prober.go:28] interesting pod/controller-manager-75fbbf96d5-6r6lk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Feb 26 11:16:14 crc kubenswrapper[4699]: I0226 11:16:14.968489 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.855522 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.864836 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.888564 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz"] Feb 26 11:16:15 crc kubenswrapper[4699]: E0226 11:16:15.888880 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerName="controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.888904 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerName="controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: E0226 11:16:15.888922 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerName="route-controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.888933 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerName="route-controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.889072 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" containerName="route-controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.889093 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" containerName="controller-manager" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.889674 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.909401 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz"] Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.945587 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vbq\" (UniqueName: \"kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq\") pod \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.945676 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles\") pod \"0d17836b-1dda-4b03-8417-7025a21b7f0f\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.945914 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869c67c3-005d-47f5-9dc7-9f253c523541-serving-cert\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.946085 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-config\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.946274 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-client-ca\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.946388 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28g8\" (UniqueName: \"kubernetes.io/projected/869c67c3-005d-47f5-9dc7-9f253c523541-kube-api-access-s28g8\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.947495 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d17836b-1dda-4b03-8417-7025a21b7f0f" (UID: "0d17836b-1dda-4b03-8417-7025a21b7f0f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:16:15 crc kubenswrapper[4699]: I0226 11:16:15.951619 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq" (OuterVolumeSpecName: "kube-api-access-54vbq") pod "bcf1b992-7a07-4490-bf91-a0e2f802d6aa" (UID: "bcf1b992-7a07-4490-bf91-a0e2f802d6aa"). InnerVolumeSpecName "kube-api-access-54vbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.006499 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" event={"ID":"bcf1b992-7a07-4490-bf91-a0e2f802d6aa","Type":"ContainerDied","Data":"fc9c825381387b5b1c159e4f7a59a6327fb0accf80fe9e5362f7a82129337743"} Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.006558 4699 scope.go:117] "RemoveContainer" containerID="3bdc2b92ccb0248bc419987c54d581fc1cccfbe419459f4758200553b00ea095" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.006520 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.008255 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" event={"ID":"0d17836b-1dda-4b03-8417-7025a21b7f0f","Type":"ContainerDied","Data":"49b49b629531d2d3d10e7e6211aa14269de0c64c913e3b2b4346368fd08d5a2b"} Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.008344 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047156 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config\") pod \"0d17836b-1dda-4b03-8417-7025a21b7f0f\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047248 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca\") pod \"0d17836b-1dda-4b03-8417-7025a21b7f0f\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047276 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert\") pod \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047314 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert\") pod \"0d17836b-1dda-4b03-8417-7025a21b7f0f\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047348 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv9tf\" (UniqueName: \"kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf\") pod \"0d17836b-1dda-4b03-8417-7025a21b7f0f\" (UID: \"0d17836b-1dda-4b03-8417-7025a21b7f0f\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047384 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config\") pod \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047421 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca\") pod \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\" (UID: \"bcf1b992-7a07-4490-bf91-a0e2f802d6aa\") " Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047699 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-client-ca\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047790 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28g8\" (UniqueName: \"kubernetes.io/projected/869c67c3-005d-47f5-9dc7-9f253c523541-kube-api-access-s28g8\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047842 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869c67c3-005d-47f5-9dc7-9f253c523541-serving-cert\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047880 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d17836b-1dda-4b03-8417-7025a21b7f0f" (UID: "0d17836b-1dda-4b03-8417-7025a21b7f0f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.047951 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-config\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.048018 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vbq\" (UniqueName: \"kubernetes.io/projected/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-kube-api-access-54vbq\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.048036 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.048051 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.048087 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config" (OuterVolumeSpecName: "config") pod "0d17836b-1dda-4b03-8417-7025a21b7f0f" (UID: "0d17836b-1dda-4b03-8417-7025a21b7f0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.049372 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config" (OuterVolumeSpecName: "config") pod "bcf1b992-7a07-4490-bf91-a0e2f802d6aa" (UID: "bcf1b992-7a07-4490-bf91-a0e2f802d6aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.049422 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "bcf1b992-7a07-4490-bf91-a0e2f802d6aa" (UID: "bcf1b992-7a07-4490-bf91-a0e2f802d6aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.049798 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-client-ca\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.050157 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869c67c3-005d-47f5-9dc7-9f253c523541-config\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.052143 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"bcf1b992-7a07-4490-bf91-a0e2f802d6aa" (UID: "bcf1b992-7a07-4490-bf91-a0e2f802d6aa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.052325 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d17836b-1dda-4b03-8417-7025a21b7f0f" (UID: "0d17836b-1dda-4b03-8417-7025a21b7f0f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.052926 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf" (OuterVolumeSpecName: "kube-api-access-rv9tf") pod "0d17836b-1dda-4b03-8417-7025a21b7f0f" (UID: "0d17836b-1dda-4b03-8417-7025a21b7f0f"). InnerVolumeSpecName "kube-api-access-rv9tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.053230 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869c67c3-005d-47f5-9dc7-9f253c523541-serving-cert\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.068309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28g8\" (UniqueName: \"kubernetes.io/projected/869c67c3-005d-47f5-9dc7-9f253c523541-kube-api-access-s28g8\") pod \"route-controller-manager-66fb947f99-29sbz\" (UID: \"869c67c3-005d-47f5-9dc7-9f253c523541\") " pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.149833 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d17836b-1dda-4b03-8417-7025a21b7f0f-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.149882 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.149893 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d17836b-1dda-4b03-8417-7025a21b7f0f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.149902 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv9tf\" (UniqueName: \"kubernetes.io/projected/0d17836b-1dda-4b03-8417-7025a21b7f0f-kube-api-access-rv9tf\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 
crc kubenswrapper[4699]: I0226 11:16:16.149912 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.149920 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf1b992-7a07-4490-bf91-a0e2f802d6aa-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.214548 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.668134 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.677599 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75fbbf96d5-6r6lk"] Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.682518 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:16:16 crc kubenswrapper[4699]: I0226 11:16:16.687610 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd48cdbf5-ckczt"] Feb 26 11:16:17 crc kubenswrapper[4699]: I0226 11:16:17.741906 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-bbg8s"] Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.109215 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx"] Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.110035 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.115771 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.117361 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.118621 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.120495 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.124412 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx"] Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.124641 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.125677 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.129006 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.173009 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xm8v\" (UniqueName: \"kubernetes.io/projected/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-kube-api-access-8xm8v\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " 
pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.173224 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-proxy-ca-bundles\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.173406 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-client-ca\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.173488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-serving-cert\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.173539 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-config\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.229430 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhgnz" 
Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.271098 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d17836b-1dda-4b03-8417-7025a21b7f0f" path="/var/lib/kubelet/pods/0d17836b-1dda-4b03-8417-7025a21b7f0f/volumes" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.271965 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf1b992-7a07-4490-bf91-a0e2f802d6aa" path="/var/lib/kubelet/pods/bcf1b992-7a07-4490-bf91-a0e2f802d6aa/volumes" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.274775 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-client-ca\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.274833 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-serving-cert\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.274879 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-config\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.275047 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xm8v\" (UniqueName: 
\"kubernetes.io/projected/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-kube-api-access-8xm8v\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.275601 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-proxy-ca-bundles\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.276611 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-client-ca\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.319415 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-serving-cert\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.319837 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-proxy-ca-bundles\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.319606 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-config\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.322640 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xm8v\" (UniqueName: \"kubernetes.io/projected/92e41a97-a913-4bed-87e8-1d3f55e0aa1a-kube-api-access-8xm8v\") pod \"controller-manager-6cbf55bfdf-xlnsx\" (UID: \"92e41a97-a913-4bed-87e8-1d3f55e0aa1a\") " pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.338950 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.459603 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:18 crc kubenswrapper[4699]: I0226 11:16:18.471399 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhgnz"] Feb 26 11:16:19 crc kubenswrapper[4699]: I0226 11:16:19.288946 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tcnxt" Feb 26 11:16:19 crc kubenswrapper[4699]: I0226 11:16:19.995103 4699 scope.go:117] "RemoveContainer" containerID="24a7d1423c70d2629cc84a6d309405af35280dc1c8f65cb9cbedd726c5936739" Feb 26 11:16:20 crc kubenswrapper[4699]: I0226 11:16:20.036165 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" event={"ID":"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a","Type":"ContainerStarted","Data":"3d6fd9756c134f401a126d34c967c09686961803acbfb7150e119a16e1b25167"} Feb 26 11:16:20 crc kubenswrapper[4699]: I0226 11:16:20.036430 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhgnz" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="registry-server" containerID="cri-o://b8eedef066fa8aaa6df130360e7ae91b6c35c65386cf0e1eb331ae24b87e6305" gracePeriod=2 Feb 26 11:16:20 crc kubenswrapper[4699]: I0226 11:16:20.605464 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz"] Feb 26 11:16:20 crc kubenswrapper[4699]: I0226 11:16:20.711452 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx"] Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.064703 4699 generic.go:334] "Generic (PLEG): container finished" podID="1389c8c4-9546-4193-8067-50db90448d4f" containerID="b8eedef066fa8aaa6df130360e7ae91b6c35c65386cf0e1eb331ae24b87e6305" exitCode=0 Feb 26 
11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.065025 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerDied","Data":"b8eedef066fa8aaa6df130360e7ae91b6c35c65386cf0e1eb331ae24b87e6305"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.065059 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhgnz" event={"ID":"1389c8c4-9546-4193-8067-50db90448d4f","Type":"ContainerDied","Data":"f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.065075 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f819282439bed9f63847862724adeef5b7b347bc240dc1336ee32c15da7bf7cc" Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.066619 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" event={"ID":"92e41a97-a913-4bed-87e8-1d3f55e0aa1a","Type":"ContainerStarted","Data":"82ebdc1d3bcdfe014c120b07518e64401ba8d256cb93f01f647bf2ee46fa985c"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.068779 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" event={"ID":"869c67c3-005d-47f5-9dc7-9f253c523541","Type":"ContainerStarted","Data":"5b93c6ed5a55a83728c711bd95b5222e62a3d5d60fd870e0d69e6c3896a973c9"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.071157 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerStarted","Data":"d27dda8ede66374aa47b77a60b930fa0b6c4e065e9c9b269dc3e8dd85fa02ece"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.075515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerStarted","Data":"7480103b052e67e1c14af93c5ed9ab5b5c3150d0a1dbb5d35641a39bc2cc9515"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.078199 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerStarted","Data":"c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.080332 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerStarted","Data":"919888fa21cfe39704e1b0c864c73cd7cdeeac94e5ee1bb4c79246202be61323"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.082390 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerStarted","Data":"c429ee05cb01901447a5e3bded424d4a0427e987ffd209a1f29754bcb9be9b4d"} Feb 26 11:16:21 crc kubenswrapper[4699]: I0226 11:16:21.313883 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mzgjj" podStartSLOduration=5.321535885 podStartE2EDuration="2m14.313850579s" podCreationTimestamp="2026-02-26 11:14:07 +0000 UTC" firstStartedPulling="2026-02-26 11:14:11.064536677 +0000 UTC m=+196.875363111" lastFinishedPulling="2026-02-26 11:16:20.056851371 +0000 UTC m=+325.867677805" observedRunningTime="2026-02-26 11:16:21.259452287 +0000 UTC m=+327.070278721" watchObservedRunningTime="2026-02-26 11:16:21.313850579 +0000 UTC m=+327.124677023" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.258891 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" 
event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerStarted","Data":"e63934f65b729d4f1b8b668dbe9b4795f057f647c6b7a160c5e82634ad1de5fd"} Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.374630 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.711894 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities\") pod \"1389c8c4-9546-4193-8067-50db90448d4f\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.711986 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hnhh\" (UniqueName: \"kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh\") pod \"1389c8c4-9546-4193-8067-50db90448d4f\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.712035 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content\") pod \"1389c8c4-9546-4193-8067-50db90448d4f\" (UID: \"1389c8c4-9546-4193-8067-50db90448d4f\") " Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.712651 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities" (OuterVolumeSpecName: "utilities") pod "1389c8c4-9546-4193-8067-50db90448d4f" (UID: "1389c8c4-9546-4193-8067-50db90448d4f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.728296 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh" (OuterVolumeSpecName: "kube-api-access-8hnhh") pod "1389c8c4-9546-4193-8067-50db90448d4f" (UID: "1389c8c4-9546-4193-8067-50db90448d4f"). InnerVolumeSpecName "kube-api-access-8hnhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.791107 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1389c8c4-9546-4193-8067-50db90448d4f" (UID: "1389c8c4-9546-4193-8067-50db90448d4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.813889 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.813938 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hnhh\" (UniqueName: \"kubernetes.io/projected/1389c8c4-9546-4193-8067-50db90448d4f-kube-api-access-8hnhh\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:22 crc kubenswrapper[4699]: I0226 11:16:22.813950 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1389c8c4-9546-4193-8067-50db90448d4f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.477512 4699 generic.go:334] "Generic (PLEG): container finished" podID="8c96a703-e568-4916-8035-a951ae91dc2b" 
containerID="7480103b052e67e1c14af93c5ed9ab5b5c3150d0a1dbb5d35641a39bc2cc9515" exitCode=0 Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.477579 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerDied","Data":"7480103b052e67e1c14af93c5ed9ab5b5c3150d0a1dbb5d35641a39bc2cc9515"} Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.482142 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerStarted","Data":"7d653e44fd8d815b615ce9635176302fd8a0ad6d3f93420c0c7d85da3992bebc"} Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.487793 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhgnz" Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.487980 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" event={"ID":"92e41a97-a913-4bed-87e8-1d3f55e0aa1a","Type":"ContainerStarted","Data":"6acfdb78f4a78fdad4245053d38c321d68fa1332b885aac9d01d5287f64dbb26"} Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.489176 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.490842 4699 patch_prober.go:28] interesting pod/controller-manager-6cbf55bfdf-xlnsx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.490890 4699 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" podUID="92e41a97-a913-4bed-87e8-1d3f55e0aa1a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Feb 26 11:16:23 crc kubenswrapper[4699]: I0226 11:16:23.605053 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" podStartSLOduration=15.605026387 podStartE2EDuration="15.605026387s" podCreationTimestamp="2026-02-26 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:16:23.600826728 +0000 UTC m=+329.411653182" watchObservedRunningTime="2026-02-26 11:16:23.605026387 +0000 UTC m=+329.415852831" Feb 26 11:16:25 crc kubenswrapper[4699]: E0226 11:16:25.514583 4699 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.254s" Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.531406 4699 generic.go:334] "Generic (PLEG): container finished" podID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerID="919888fa21cfe39704e1b0c864c73cd7cdeeac94e5ee1bb4c79246202be61323" exitCode=0 Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.531528 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerDied","Data":"919888fa21cfe39704e1b0c864c73cd7cdeeac94e5ee1bb4c79246202be61323"} Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.536753 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" event={"ID":"869c67c3-005d-47f5-9dc7-9f253c523541","Type":"ContainerStarted","Data":"07b45cc6373fa8988168f1cc8f5d8abc09ef0f4b1efdd7c69a6a543665bba754"} Feb 
26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.541334 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerID="e63934f65b729d4f1b8b668dbe9b4795f057f647c6b7a160c5e82634ad1de5fd" exitCode=0 Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.541412 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerDied","Data":"e63934f65b729d4f1b8b668dbe9b4795f057f647c6b7a160c5e82634ad1de5fd"} Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.547425 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" event={"ID":"c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a","Type":"ContainerStarted","Data":"2ce288c961079d54b5964b5eb4891bde97dbf19c66bf3a9774202810b5d5b79a"} Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.547461 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.562289 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cbf55bfdf-xlnsx" Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.810909 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" podStartSLOduration=57.810888891 podStartE2EDuration="57.810888891s" podCreationTimestamp="2026-02-26 11:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:16:25.808959986 +0000 UTC m=+331.619786430" watchObservedRunningTime="2026-02-26 11:16:25.810888891 +0000 UTC m=+331.621715335" Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.888706 4699 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-fhgnz"] Feb 26 11:16:25 crc kubenswrapper[4699]: I0226 11:16:25.896556 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhgnz"] Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.440093 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1389c8c4-9546-4193-8067-50db90448d4f" path="/var/lib/kubelet/pods/1389c8c4-9546-4193-8067-50db90448d4f/volumes" Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.477015 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.616569 4699 generic.go:334] "Generic (PLEG): container finished" podID="9ea10063-7888-400e-af1c-216cbde5a13e" containerID="c429ee05cb01901447a5e3bded424d4a0427e987ffd209a1f29754bcb9be9b4d" exitCode=0 Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.618315 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerDied","Data":"c429ee05cb01901447a5e3bded424d4a0427e987ffd209a1f29754bcb9be9b4d"} Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.621417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:26 crc kubenswrapper[4699]: I0226 11:16:26.948598 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" Feb 26 11:16:27 crc kubenswrapper[4699]: I0226 11:16:27.219219 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66fb947f99-29sbz" podStartSLOduration=19.219191128 podStartE2EDuration="19.219191128s" 
podCreationTimestamp="2026-02-26 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:16:26.849234671 +0000 UTC m=+332.660061115" watchObservedRunningTime="2026-02-26 11:16:27.219191128 +0000 UTC m=+333.030017562" Feb 26 11:16:27 crc kubenswrapper[4699]: I0226 11:16:27.609020 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mzgjj" Feb 26 11:16:27 crc kubenswrapper[4699]: I0226 11:16:27.609622 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mzgjj" Feb 26 11:16:27 crc kubenswrapper[4699]: I0226 11:16:27.968379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" event={"ID":"0d9d78c8-4193-47a8-9ed9-208f6dc25831","Type":"ContainerStarted","Data":"000757444f955626a5cade194e8afdfce85b9f484def8b4bc1703641245c47c3"} Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:27.995528 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" podStartSLOduration=13.250882132 podStartE2EDuration="27.995505611s" podCreationTimestamp="2026-02-26 11:16:00 +0000 UTC" firstStartedPulling="2026-02-26 11:16:01.047136633 +0000 UTC m=+306.857963067" lastFinishedPulling="2026-02-26 11:16:15.791760112 +0000 UTC m=+321.602586546" observedRunningTime="2026-02-26 11:16:27.992576757 +0000 UTC m=+333.803403211" watchObservedRunningTime="2026-02-26 11:16:27.995505611 +0000 UTC m=+333.806332045" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.181802 4699 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.182824 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="extract-utilities" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.182859 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="extract-utilities" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.182897 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="extract-content" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.182911 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="extract-content" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.182932 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="registry-server" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.182954 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="registry-server" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.183471 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1389c8c4-9546-4193-8067-50db90448d4f" containerName="registry-server" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.186592 4699 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.191340 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886" gracePeriod=15 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.192378 4699 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec" gracePeriod=15 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.192779 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6" gracePeriod=15 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.193727 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356" gracePeriod=15 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.204095 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e" gracePeriod=15 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.212476 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.213856 4699 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216315 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216349 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216364 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216374 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216385 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216393 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216402 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216409 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216424 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216431 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216440 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216467 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216486 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216494 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.216512 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216521 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216719 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216739 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc 
kubenswrapper[4699]: I0226 11:16:28.216754 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216770 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216779 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216787 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216799 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.216807 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.218967 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.219016 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.219046 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.219062 4699 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.220174 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.242563 4699 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 26 11:16:28 crc kubenswrapper[4699]: E0226 11:16:28.274082 4699 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.382684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383335 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383395 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383447 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383476 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.383763 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485593 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485837 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485881 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485907 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485935 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.485984 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486033 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486083 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486250 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486339 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486387 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486422 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486445 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486482 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.486550 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.574946 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.982044 4699 generic.go:334] "Generic (PLEG): container finished" podID="a904aa73-23d7-4994-882a-4afafe02fb82" containerID="3fd7c462ca5ff5bad3835e30c617b483507570293909f8328947b3fbdead2389" exitCode=0 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.982200 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a904aa73-23d7-4994-882a-4afafe02fb82","Type":"ContainerDied","Data":"3fd7c462ca5ff5bad3835e30c617b483507570293909f8328947b3fbdead2389"} Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.983403 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.987387 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.991156 4699 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.992531 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886" exitCode=0 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.992567 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec" exitCode=0 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.992578 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6" exitCode=0 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.992592 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356" exitCode=2 Feb 26 11:16:28 crc kubenswrapper[4699]: I0226 11:16:28.994483 4699 scope.go:117] "RemoveContainer" containerID="eb572790b6587d3d0bd02995a8bc9efc2f98a626a79ac288ee6925e50809fcf0" Feb 26 11:16:29 crc kubenswrapper[4699]: I0226 11:16:29.013461 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mzgjj" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server" probeResult="failure" output=< Feb 26 11:16:29 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Feb 26 11:16:29 crc kubenswrapper[4699]: > Feb 26 11:16:29 crc kubenswrapper[4699]: E0226 11:16:29.014198 4699 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.213:6443: 
connect: connection refused" event=< Feb 26 11:16:29 crc kubenswrapper[4699]: &Event{ObjectMeta:{certified-operators-mzgjj.1897c7bc14fd9270 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-mzgjj,UID:71a83978-4f86-404b-967a-0e7493ff6721,APIVersion:v1,ResourceVersion:28011,FieldPath:spec.containers{registry-server},},Reason:Unhealthy,Message:Startup probe failed: timeout: failed to connect service ":50051" within 1s Feb 26 11:16:29 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:16:29.013521008 +0000 UTC m=+334.824347442,LastTimestamp:2026-02-26 11:16:29.013521008 +0000 UTC m=+334.824347442,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:16:29 crc kubenswrapper[4699]: > Feb 26 11:16:30 crc kubenswrapper[4699]: I0226 11:16:30.001671 4699 generic.go:334] "Generic (PLEG): container finished" podID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" containerID="000757444f955626a5cade194e8afdfce85b9f484def8b4bc1703641245c47c3" exitCode=0 Feb 26 11:16:30 crc kubenswrapper[4699]: I0226 11:16:30.001853 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" event={"ID":"0d9d78c8-4193-47a8-9ed9-208f6dc25831","Type":"ContainerDied","Data":"000757444f955626a5cade194e8afdfce85b9f484def8b4bc1703641245c47c3"} Feb 26 11:16:30 crc kubenswrapper[4699]: I0226 11:16:30.005199 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:30 crc kubenswrapper[4699]: I0226 11:16:30.005803 4699 
status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:30 crc kubenswrapper[4699]: E0226 11:16:30.953714 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9da973_6b5f_4485_adca_8792b0a3d256.slice/crio-conmon-7d653e44fd8d815b615ce9635176302fd8a0ad6d3f93420c0c7d85da3992bebc.scope\": RecentStats: unable to find data in memory cache]" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.015655 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.018202 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e" exitCode=0 Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.022226 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerID="7d653e44fd8d815b615ce9635176302fd8a0ad6d3f93420c0c7d85da3992bebc" exitCode=0 Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.022345 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" 
event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerDied","Data":"7d653e44fd8d815b615ce9635176302fd8a0ad6d3f93420c0c7d85da3992bebc"} Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.023313 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.023676 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.023669 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fed17fa17d1e38cfae5f233da9ff311539827646f8cade76c9ff17fc397c01f8"} Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.024315 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.025340 4699 generic.go:334] "Generic (PLEG): container finished" podID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerID="d27dda8ede66374aa47b77a60b930fa0b6c4e065e9c9b269dc3e8dd85fa02ece" exitCode=0 Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.025424 4699 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerDied","Data":"d27dda8ede66374aa47b77a60b930fa0b6c4e065e9c9b269dc3e8dd85fa02ece"} Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.026410 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.026746 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.026991 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.027334 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a904aa73-23d7-4994-882a-4afafe02fb82","Type":"ContainerDied","Data":"c6277d115f7a8e7d06e98e6fbf746a8f5f67a2bf9660b521fc6a925c224a7f1a"} Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.027351 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.027414 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6277d115f7a8e7d06e98e6fbf746a8f5f67a2bf9660b521fc6a925c224a7f1a" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.109700 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.110530 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.111035 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.111273 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.111556 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151196 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir\") pod \"a904aa73-23d7-4994-882a-4afafe02fb82\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151258 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access\") pod \"a904aa73-23d7-4994-882a-4afafe02fb82\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151294 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock\") pod \"a904aa73-23d7-4994-882a-4afafe02fb82\" (UID: \"a904aa73-23d7-4994-882a-4afafe02fb82\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151295 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a904aa73-23d7-4994-882a-4afafe02fb82" (UID: "a904aa73-23d7-4994-882a-4afafe02fb82"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151530 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock" (OuterVolumeSpecName: "var-lock") pod "a904aa73-23d7-4994-882a-4afafe02fb82" (UID: "a904aa73-23d7-4994-882a-4afafe02fb82"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151957 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.151980 4699 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a904aa73-23d7-4994-882a-4afafe02fb82-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.158493 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a904aa73-23d7-4994-882a-4afafe02fb82" (UID: "a904aa73-23d7-4994-882a-4afafe02fb82"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.253588 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a904aa73-23d7-4994-882a-4afafe02fb82-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.369782 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.379874 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.380888 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.381296 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.381735 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.381941 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.382170 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: 
connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.557583 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.557667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.557786 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.558247 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.558286 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.558303 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.659872 4699 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.659923 4699 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.659936 4699 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.697141 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.697741 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.698236 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.699428 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.701520 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.702088 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 
38.102.83.213:6443: connect: connection refused" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.862040 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knl5z\" (UniqueName: \"kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z\") pod \"0d9d78c8-4193-47a8-9ed9-208f6dc25831\" (UID: \"0d9d78c8-4193-47a8-9ed9-208f6dc25831\") " Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.873596 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z" (OuterVolumeSpecName: "kube-api-access-knl5z") pod "0d9d78c8-4193-47a8-9ed9-208f6dc25831" (UID: "0d9d78c8-4193-47a8-9ed9-208f6dc25831"). InnerVolumeSpecName "kube-api-access-knl5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:16:31 crc kubenswrapper[4699]: I0226 11:16:31.964959 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knl5z\" (UniqueName: \"kubernetes.io/projected/0d9d78c8-4193-47a8-9ed9-208f6dc25831-kube-api-access-knl5z\") on node \"crc\" DevicePath \"\"" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.040586 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerStarted","Data":"4459df84e7aab7535bf4732238c87c4da5222e3237b69439fff20886a1ea7688"} Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.042220 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.042477 4699 status_manager.go:851] "Failed to get status 
for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.042791 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.043266 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.043542 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.044048 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.049996 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.058745 4699 scope.go:117] "RemoveContainer" containerID="b29557dbd4105e597cfdc9d373556a23fe5f57a9d5ebf3c8bf9db2cd72c81886" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.058809 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.064735 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerStarted","Data":"b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca"} Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066034 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066239 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066442 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: 
connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066624 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066774 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.066924 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.067423 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.070069 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"faaf0aadacd79051543cdb9cfcd026bfc89d0e173e7f3faae8b64f52a92a3ab3"} Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.070533 4699 
status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: E0226 11:16:32.070659 4699 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.070775 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.072461 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.073527 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.073937 4699 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.074296 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.074751 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.076546 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerStarted","Data":"aa39a04716f8cdcc694931265437b30c1cb1c3615ab11016472ce3e95c18688b"} Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.077238 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.077617 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.077994 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.078338 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.078775 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.079247 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.079639 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.079846 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.081385 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.081374 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" event={"ID":"0d9d78c8-4193-47a8-9ed9-208f6dc25831","Type":"ContainerDied","Data":"d9a56a2a86268af382b046874040445b48c2975a953e5d204ab0a77f6c325fdc"} Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.081513 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a56a2a86268af382b046874040445b48c2975a953e5d204ab0a77f6c325fdc" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.095353 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.098174 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.098878 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.105323 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.105869 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.106409 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.106850 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.107284 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.107649 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.107777 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerStarted","Data":"bca562e2d2fabc5097841d6398d6c6b6a6779605566f4dd3173111ee1e8c04f3"} Feb 26 11:16:32 crc 
kubenswrapper[4699]: I0226 11:16:32.111253 4699 scope.go:117] "RemoveContainer" containerID="ca0f1499c0047da376575db5cd438b18458dd638860a008df3b8341f2658d9ec" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.117514 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.117861 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.122529 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.124585 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.124818 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.125045 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.125285 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.125508 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.125730 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.127176 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.129320 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.129581 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.129808 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.130099 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.130369 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.130635 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.130889 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.131260 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.179478 4699 scope.go:117] "RemoveContainer" containerID="9756b395ca3363066f10f261108f31b35a960097dfd0720250c5dad7e95cf7f6" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.190315 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.190731 4699 status_manager.go:851] 
"Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.191161 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.191359 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.191569 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.191921 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.195469 4699 status_manager.go:851] "Failed to get status 
for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.196010 4699 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.196445 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.208572 4699 scope.go:117] "RemoveContainer" containerID="084a46a1304a8c639e21ab541eedf94adf1f3bd63960a1699426ebc68cddd356" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.234492 4699 scope.go:117] "RemoveContainer" containerID="9324369466fb6495b3f10e66136a5aaf920f50540ab5452b235060a1e229769e" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.271064 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 26 11:16:32 crc kubenswrapper[4699]: I0226 11:16:32.272089 4699 scope.go:117] "RemoveContainer" containerID="2490762714e27b6a745ba229aafbeca2b808eed346d6bd4de519a72944055035" Feb 26 11:16:32 crc kubenswrapper[4699]: E0226 11:16:32.912424 4699 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event=< Feb 26 11:16:32 crc kubenswrapper[4699]: &Event{ObjectMeta:{certified-operators-mzgjj.1897c7bc14fd9270 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-mzgjj,UID:71a83978-4f86-404b-967a-0e7493ff6721,APIVersion:v1,ResourceVersion:28011,FieldPath:spec.containers{registry-server},},Reason:Unhealthy,Message:Startup probe failed: timeout: failed to connect service ":50051" within 1s Feb 26 11:16:32 crc kubenswrapper[4699]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 11:16:29.013521008 +0000 UTC m=+334.824347442,LastTimestamp:2026-02-26 11:16:29.013521008 +0000 UTC m=+334.824347442,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 11:16:32 crc kubenswrapper[4699]: > Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.127248 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerStarted","Data":"704c8ba25f50ac5c881bb9d05eb872ee3851c9a21d28c2acf7a27a400acbebe0"} Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.128751 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.129840 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" 
pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.130704 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.132150 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.133255 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.133710 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.134433 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" 
pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.135583 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.142283 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerStarted","Data":"58ec65080ac68341b08f4272194fe62d85383a27766f002151749856e7c508e7"} Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.146351 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: E0226 11:16:33.146417 4699 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.146774 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.147381 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.151028 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.154349 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.155684 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.156740 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:33 crc kubenswrapper[4699]: I0226 11:16:33.158017 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.406249 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.406334 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.406373 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.406412 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.415210 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.415292 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.415897 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.447372 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.462588 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.466074 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:34 crc kubenswrapper[4699]: I0226 11:16:34.561246 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.229307 4699 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150" Netns:"/var/run/netns/133c11d8-7a50-4785-9ef4-18402b3d7555" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: 
[openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.229654 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150" Netns:"/var/run/netns/133c11d8-7a50-4785-9ef4-18402b3d7555" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.229676 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150): error adding pod 
openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150" Netns:"/var/run/netns/133c11d8-7a50-4785-9ef4-18402b3d7555" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.229735 4699 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150\\\" Netns:\\\"/var/run/netns/133c11d8-7a50-4785-9ef4-18402b3d7555\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=4b9c64eccc53ca6b1d5079baf6d1ddc69de002cf2d24471b40f9a87fa976f150;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s\\\": dial tcp 
38.102.83.213:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.447457 4699 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-xd92c_openshift-network-diagnostics_3b6479f0-333b-4a96-9adf-2099afdc2447_0(4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae): error adding pod openshift-network-diagnostics_network-check-target-xd92c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae" Netns:"/var/run/netns/2766d39c-b661-4580-8fc6-ec348c624ecd" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-xd92c;K8S_POD_INFRA_CONTAINER_ID=4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae;K8S_POD_UID=3b6479f0-333b-4a96-9adf-2099afdc2447" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-xd92c] networking: Multus: [openshift-network-diagnostics/network-check-target-xd92c/3b6479f0-333b-4a96-9adf-2099afdc2447]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to 
update the pod network-check-target-xd92c in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-target-xd92c in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.447547 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-xd92c_openshift-network-diagnostics_3b6479f0-333b-4a96-9adf-2099afdc2447_0(4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae): error adding pod openshift-network-diagnostics_network-check-target-xd92c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae" Netns:"/var/run/netns/2766d39c-b661-4580-8fc6-ec348c624ecd" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-xd92c;K8S_POD_INFRA_CONTAINER_ID=4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae;K8S_POD_UID=3b6479f0-333b-4a96-9adf-2099afdc2447" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-xd92c] 
networking: Multus: [openshift-network-diagnostics/network-check-target-xd92c/3b6479f0-333b-4a96-9adf-2099afdc2447]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-target-xd92c in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-target-xd92c in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.447568 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-xd92c_openshift-network-diagnostics_3b6479f0-333b-4a96-9adf-2099afdc2447_0(4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae): error adding pod openshift-network-diagnostics_network-check-target-xd92c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae" Netns:"/var/run/netns/2766d39c-b661-4580-8fc6-ec348c624ecd" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-xd92c;K8S_POD_INFRA_CONTAINER_ID=4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae;K8S_POD_UID=3b6479f0-333b-4a96-9adf-2099afdc2447" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-xd92c] networking: Multus: [openshift-network-diagnostics/network-check-target-xd92c/3b6479f0-333b-4a96-9adf-2099afdc2447]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-target-xd92c in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-target-xd92c in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.447628 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-check-target-xd92c_openshift-network-diagnostics(3b6479f0-333b-4a96-9adf-2099afdc2447)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-check-target-xd92c_openshift-network-diagnostics(3b6479f0-333b-4a96-9adf-2099afdc2447)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_network-check-target-xd92c_openshift-network-diagnostics_3b6479f0-333b-4a96-9adf-2099afdc2447_0(4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae): error adding pod openshift-network-diagnostics_network-check-target-xd92c to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae\\\" Netns:\\\"/var/run/netns/2766d39c-b661-4580-8fc6-ec348c624ecd\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-xd92c;K8S_POD_INFRA_CONTAINER_ID=4aeb54c841185c5491db25eb2ef7e833fb793475dad5f13ba58008593b2e6bae;K8S_POD_UID=3b6479f0-333b-4a96-9adf-2099afdc2447\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-xd92c] networking: Multus: [openshift-network-diagnostics/network-check-target-xd92c/3b6479f0-333b-4a96-9adf-2099afdc2447]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-target-xd92c in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-target-xd92c in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c?timeout=1m0s\\\": dial tcp 38.102.83.213:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.530140 4699 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096" Netns:"/var/run/netns/fceeb3d9-4f37-4878-b6c0-5410f26fe14e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.530226 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096" Netns:"/var/run/netns/fceeb3d9-4f37-4878-b6c0-5410f26fe14e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod 
[openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.530249 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 26 11:16:35 crc kubenswrapper[4699]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096" 
Netns:"/var/run/netns/fceeb3d9-4f37-4878-b6c0-5410f26fe14e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.213:6443: connect: connection refused Feb 26 11:16:35 crc kubenswrapper[4699]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 11:16:35 crc kubenswrapper[4699]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:35 crc kubenswrapper[4699]: E0226 11:16:35.530352 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096\\\" Netns:\\\"/var/run/netns/fceeb3d9-4f37-4878-b6c0-5410f26fe14e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=096a117dc4643f6623dddd247c83762dc7b1f2ccba0b0277af31a471398f3096;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s\\\": dial tcp 38.102.83.213:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.263694 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.264149 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.264758 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.265163 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" 
pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.265646 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.265908 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.266211 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.266537 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.505340 4699 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.506443 4699 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.507205 4699 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.507537 4699 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.507802 4699 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:36 crc kubenswrapper[4699]: I0226 11:16:36.507826 4699 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.508238 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms"
Feb 26 11:16:36 crc kubenswrapper[4699]: E0226 11:16:36.708653 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms"
Feb 26 11:16:37 crc kubenswrapper[4699]: E0226 11:16:37.109921 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.632096 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.632587 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.632981 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.633566 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.633863 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.634206 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.634554 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.634880 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.635198 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.635568 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.681437 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.681508 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.684369 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mzgjj"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.685435 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.685940 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.686435 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.695423 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.696081 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.696620 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.697183 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.698389 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.698713 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.762057 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.762726 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.763211 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.763562 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.763912 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.764193 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.764464 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.764768 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.765025 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.765296 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: E0226 11:16:37.910872 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.938192 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.938240 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.986842 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.987336 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.987709 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.988185 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.988431 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.988723 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.988998 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.989235 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.989517 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:37 crc kubenswrapper[4699]: I0226 11:16:37.989827 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.224623 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-czwkc"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.225808 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.226657 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.227090 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.227387 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.227502 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phhbz"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.227764 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.228306 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.228558 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.228927 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.229333 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.229761 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.230097 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.230375 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.230603 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.230987 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.231368 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.231725 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.232292 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:38 crc kubenswrapper[4699]: I0226 11:16:38.232595 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.260136 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.260719 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.261380 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.261664 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.261905 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.262625 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.263174 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.263455 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.263725 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.264190 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.274632 4699 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.274682 4699 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2"
Feb 26 11:16:39 crc kubenswrapper[4699]: E0226 11:16:39.275207 4699 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.275585 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:16:39 crc kubenswrapper[4699]: E0226 11:16:39.512242 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.878239 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s8kpz"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.878318 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s8kpz"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.927930 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s8kpz"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.928442 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.928957 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.929200 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.929383 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.929548 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.929756 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.929924 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.930068 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:39 crc kubenswrapper[4699]: I0226 11:16:39.930435 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.197627 4699 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="48cf4b0cee4da5e5cbcf8557432f0b1f33a2168dd67b4f462eebcce08f058e73" exitCode=0
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.197736 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"48cf4b0cee4da5e5cbcf8557432f0b1f33a2168dd67b4f462eebcce08f058e73"}
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.197770 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"481a819fe1f992a44f7b1ffb594159a9ca1def1223405b7ef17bae7b584fa892"}
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.198267 4699 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.198293 4699 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2"
Feb 26 11:16:40 crc kubenswrapper[4699]: E0226 11:16:40.198782 4699 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.199040 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.199464 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.199952 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.200279 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.201090 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15"
pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.201558 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.203921 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.204345 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.204676 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.224580 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrk4n" 
Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.224671 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.246736 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.247375 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.247728 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.248224 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.248498 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: 
I0226 11:16:40.248812 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.249194 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.249495 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.249727 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.250053 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 
11:16:40.270098 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.270599 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.270774 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.270935 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.271099 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.271311 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.271475 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.271636 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.271963 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.272155 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.768964 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.769030 4699 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.811010 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.812348 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.813166 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.813952 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.814360 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.814688 4699 status_manager.go:851] "Failed to get status for 
pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.814967 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.815406 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.815983 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.816533 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.959272 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:16:40 crc kubenswrapper[4699]: I0226 11:16:40.960221 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.001224 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.002160 4699 status_manager.go:851] "Failed to get status for pod" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.002626 4699 status_manager.go:851] "Failed to get status for pod" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" pod="openshift-infra/auto-csr-approver-29535076-rv9x5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535076-rv9x5\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.002918 4699 status_manager.go:851] "Failed to get status for pod" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" pod="openshift-marketplace/redhat-marketplace-hrk4n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrk4n\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.003276 4699 status_manager.go:851] "Failed to get status for pod" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" pod="openshift-marketplace/redhat-operators-jhgks" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jhgks\": dial tcp 38.102.83.213:6443: connect: connection 
refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.003519 4699 status_manager.go:851] "Failed to get status for pod" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" pod="openshift-marketplace/community-operators-czwkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-czwkc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.003792 4699 status_manager.go:851] "Failed to get status for pod" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" pod="openshift-marketplace/redhat-marketplace-s8kpz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s8kpz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.004215 4699 status_manager.go:851] "Failed to get status for pod" podUID="71a83978-4f86-404b-967a-0e7493ff6721" pod="openshift-marketplace/certified-operators-mzgjj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mzgjj\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.004577 4699 status_manager.go:851] "Failed to get status for pod" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" pod="openshift-marketplace/certified-operators-phhbz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-phhbz\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.004836 4699 status_manager.go:851] "Failed to get status for pod" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" pod="openshift-marketplace/redhat-operators-sc9c6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sc9c6\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 26 
11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.206710 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a1843c698a251c7f80d3727f1184692d6c1af6b5dced3224a5cd37e295f94ef1"} Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.249986 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.251885 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:16:41 crc kubenswrapper[4699]: I0226 11:16:41.257108 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:16:42 crc kubenswrapper[4699]: I0226 11:16:42.234283 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5cb7b42b2f5c346f9bff2b95ce8298fcf0e6608034e456bdd599fe9085cdf98e"} Feb 26 11:16:42 crc kubenswrapper[4699]: I0226 11:16:42.234667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca226e16abe2a265999c17c48d3416798a8bdb915eca4c716f1810a970605169"} Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.261950 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.262954 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.263009 4699 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434" exitCode=1 Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.263078 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434"} Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.263436 4699 scope.go:117] "RemoveContainer" containerID="bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434" Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.266614 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95a41b7dc7585b88734e5f6819f1c30b7b13f4540c1937bb19c1f6585ca5ee27"} Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.266667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22ea1c1238fe39be89eb8deeaff7ea021e9a6e811f07e3d77c21759cac2c4689"} Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.266888 4699 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:43 crc kubenswrapper[4699]: I0226 11:16:43.266920 4699 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:43 crc kubenswrapper[4699]: 
I0226 11:16:43.319783 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.275686 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.276156 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.276201 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.276568 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.276619 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8326aeadcc826fe0424355bd287bf65d6610bc258e44863eed96368db20aa6a"} Feb 26 11:16:44 crc kubenswrapper[4699]: I0226 11:16:44.281324 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:45 crc kubenswrapper[4699]: I0226 11:16:45.171612 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:16:46 crc kubenswrapper[4699]: I0226 11:16:46.260645 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:46 crc kubenswrapper[4699]: I0226 11:16:46.261172 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:46 crc kubenswrapper[4699]: W0226 11:16:46.690333 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-00bc26edafd1327bd975f1a97124eb62afc42bc7231a3d55b2cab5b23d8e9df2 WatchSource:0}: Error finding container 00bc26edafd1327bd975f1a97124eb62afc42bc7231a3d55b2cab5b23d8e9df2: Status 404 returned error can't find the container with id 00bc26edafd1327bd975f1a97124eb62afc42bc7231a3d55b2cab5b23d8e9df2 Feb 26 11:16:47 crc kubenswrapper[4699]: I0226 11:16:47.295244 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"00bc26edafd1327bd975f1a97124eb62afc42bc7231a3d55b2cab5b23d8e9df2"} Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.277712 4699 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.303471 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3c4d331786b860f4598df47c583b55022d7939fed3acfc4cee5bc14012a5f9df"} Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.303691 4699 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.303713 4699 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.303716 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.303837 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.308016 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:16:48 crc kubenswrapper[4699]: I0226 11:16:48.323844 4699 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="808de523-c5cd-43c6-9394-190a9608a367" Feb 26 11:16:49 crc kubenswrapper[4699]: I0226 11:16:49.307836 4699 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:49 crc kubenswrapper[4699]: I0226 11:16:49.307865 4699 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ee33485d-044d-4356-a626-df5e4625a4f2" Feb 26 11:16:50 crc kubenswrapper[4699]: I0226 11:16:50.259912 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:50 crc kubenswrapper[4699]: I0226 11:16:50.260899 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 11:16:50 crc kubenswrapper[4699]: I0226 11:16:50.261299 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:50 crc kubenswrapper[4699]: I0226 11:16:50.261788 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 11:16:50 crc kubenswrapper[4699]: W0226 11:16:50.691585 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-da2cc71a1861730623d03d023ddc99bf68c166cd895e60b2f3c4083fe9be5b3e WatchSource:0}: Error finding container da2cc71a1861730623d03d023ddc99bf68c166cd895e60b2f3c4083fe9be5b3e: Status 404 returned error can't find the container with id da2cc71a1861730623d03d023ddc99bf68c166cd895e60b2f3c4083fe9be5b3e Feb 26 11:16:51 crc kubenswrapper[4699]: I0226 11:16:51.324345 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb583ed365d8e16810e128b490773c9fa9cc1d3d176995bd5e50e802cb8d7f97"} Feb 26 11:16:51 crc kubenswrapper[4699]: I0226 11:16:51.325076 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"da2cc71a1861730623d03d023ddc99bf68c166cd895e60b2f3c4083fe9be5b3e"} Feb 26 11:16:51 crc kubenswrapper[4699]: I0226 11:16:51.331063 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"52ec8357ee858714f10f2561b43d88e0ba4c9d95314995e90ecdffb9dc1b0781"} Feb 26 11:16:51 crc kubenswrapper[4699]: I0226 11:16:51.331097 4699 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8458cde859ed037ea958d3cd57ef1865030356ac78f9c293ef891e98f868d877"} Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.320316 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.320543 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.320696 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.345595 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.345646 4699 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="52ec8357ee858714f10f2561b43d88e0ba4c9d95314995e90ecdffb9dc1b0781" exitCode=255 Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.345679 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"52ec8357ee858714f10f2561b43d88e0ba4c9d95314995e90ecdffb9dc1b0781"} Feb 26 11:16:53 crc kubenswrapper[4699]: I0226 11:16:53.346179 4699 scope.go:117] "RemoveContainer" containerID="52ec8357ee858714f10f2561b43d88e0ba4c9d95314995e90ecdffb9dc1b0781" Feb 26 11:16:54 crc kubenswrapper[4699]: I0226 11:16:54.354505 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 26 11:16:54 crc kubenswrapper[4699]: I0226 11:16:54.354896 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6185db4f6fa07a6ece323e079cb5de2c410483b56be6a8ee0c600f7ca82c1ef1"} Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.361777 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.362191 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.362223 4699 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6185db4f6fa07a6ece323e079cb5de2c410483b56be6a8ee0c600f7ca82c1ef1" exitCode=255 Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.362249 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6185db4f6fa07a6ece323e079cb5de2c410483b56be6a8ee0c600f7ca82c1ef1"} Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.362281 4699 scope.go:117] "RemoveContainer" containerID="52ec8357ee858714f10f2561b43d88e0ba4c9d95314995e90ecdffb9dc1b0781" Feb 26 11:16:55 crc kubenswrapper[4699]: I0226 11:16:55.362884 4699 scope.go:117] "RemoveContainer" containerID="6185db4f6fa07a6ece323e079cb5de2c410483b56be6a8ee0c600f7ca82c1ef1" Feb 26 11:16:55 crc kubenswrapper[4699]: E0226 11:16:55.363164 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 11:16:56 crc kubenswrapper[4699]: I0226 11:16:56.290913 4699 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="808de523-c5cd-43c6-9394-190a9608a367" Feb 26 11:16:56 crc kubenswrapper[4699]: I0226 11:16:56.369838 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 26 11:16:58 crc kubenswrapper[4699]: I0226 11:16:58.016830 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 11:16:59 crc kubenswrapper[4699]: I0226 11:16:59.510633 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 11:17:00 crc kubenswrapper[4699]: I0226 11:17:00.185856 4699 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 11:17:00 crc kubenswrapper[4699]: I0226 11:17:00.543791 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 11:17:00 crc kubenswrapper[4699]: I0226 11:17:00.683760 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 11:17:00 crc kubenswrapper[4699]: I0226 11:17:00.884791 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.214634 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.264158 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.326949 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.418174 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.445057 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.650854 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.711327 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.846489 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.889473 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.919410 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 11:17:01 crc kubenswrapper[4699]: I0226 11:17:01.929727 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.012931 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.117335 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.244452 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.274078 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.313761 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.382297 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.415432 4699 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.435732 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.608662 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 11:17:02 crc kubenswrapper[4699]: I0226 11:17:02.955867 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.015433 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.194498 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.320159 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.320237 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.333368 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 
26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.350871 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.451946 4699 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.522229 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.613889 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 11:17:03 crc kubenswrapper[4699]: I0226 11:17:03.750223 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.088343 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.100907 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.141444 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.169721 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.177419 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.227633 4699 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.303773 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.428861 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.460838 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.498599 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.514653 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.593005 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.643868 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.645171 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.675954 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.801043 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 11:17:04 crc 
kubenswrapper[4699]: I0226 11:17:04.923514 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.939040 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 11:17:04 crc kubenswrapper[4699]: I0226 11:17:04.962132 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.055036 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.055860 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.070573 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.120370 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.139073 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.219380 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.286586 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.385461 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.405645 4699 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.425389 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.526487 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.666859 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.701488 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.788315 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.825303 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 11:17:05 crc kubenswrapper[4699]: I0226 11:17:05.852715 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.012869 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.019782 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 
11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.043552 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.053291 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.053881 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.059401 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.095170 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.145597 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.152755 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.166246 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.175755 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.261822 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.372166 4699 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.418812 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.463181 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.548695 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.555080 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.719102 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.719190 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.744349 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.824562 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.825912 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.834888 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.837038 
4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.893772 4699 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.896083 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s8kpz" podStartSLOduration=39.400673462 podStartE2EDuration="2m57.896057454s" podCreationTimestamp="2026-02-26 11:14:09 +0000 UTC" firstStartedPulling="2026-02-26 11:14:12.21895184 +0000 UTC m=+198.029778274" lastFinishedPulling="2026-02-26 11:16:30.714335832 +0000 UTC m=+336.525162266" observedRunningTime="2026-02-26 11:16:47.611460029 +0000 UTC m=+353.422286473" watchObservedRunningTime="2026-02-26 11:17:06.896057454 +0000 UTC m=+372.706883888" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.896671 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sc9c6" podStartSLOduration=37.07970643 podStartE2EDuration="2m56.896665591s" podCreationTimestamp="2026-02-26 11:14:10 +0000 UTC" firstStartedPulling="2026-02-26 11:14:12.18032772 +0000 UTC m=+197.991154154" lastFinishedPulling="2026-02-26 11:16:31.997286881 +0000 UTC m=+337.808113315" observedRunningTime="2026-02-26 11:16:47.684248556 +0000 UTC m=+353.495075010" watchObservedRunningTime="2026-02-26 11:17:06.896665591 +0000 UTC m=+372.707492015" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.898046 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jhgks" podStartSLOduration=37.140931362 podStartE2EDuration="2m56.898037061s" podCreationTimestamp="2026-02-26 11:14:10 +0000 UTC" firstStartedPulling="2026-02-26 11:14:12.296659914 +0000 UTC m=+198.107486348" lastFinishedPulling="2026-02-26 11:16:32.053765593 +0000 
UTC m=+337.864592047" observedRunningTime="2026-02-26 11:16:47.766610326 +0000 UTC m=+353.577436780" watchObservedRunningTime="2026-02-26 11:17:06.898037061 +0000 UTC m=+372.708863515" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.898657 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phhbz" podStartSLOduration=39.172124348 podStartE2EDuration="2m59.898647198s" podCreationTimestamp="2026-02-26 11:14:07 +0000 UTC" firstStartedPulling="2026-02-26 11:14:11.114712168 +0000 UTC m=+196.925538602" lastFinishedPulling="2026-02-26 11:16:31.841235018 +0000 UTC m=+337.652061452" observedRunningTime="2026-02-26 11:16:47.663796682 +0000 UTC m=+353.474623126" watchObservedRunningTime="2026-02-26 11:17:06.898647198 +0000 UTC m=+372.709473642" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.898999 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-czwkc" podStartSLOduration=39.833540777 podStartE2EDuration="2m59.898992988s" podCreationTimestamp="2026-02-26 11:14:07 +0000 UTC" firstStartedPulling="2026-02-26 11:14:11.090009139 +0000 UTC m=+196.900835573" lastFinishedPulling="2026-02-26 11:16:31.15546134 +0000 UTC m=+336.966287784" observedRunningTime="2026-02-26 11:16:47.5943219 +0000 UTC m=+353.405148334" watchObservedRunningTime="2026-02-26 11:17:06.898992988 +0000 UTC m=+372.709819422" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.899869 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrk4n" podStartSLOduration=38.40462806 podStartE2EDuration="2m57.899860393s" podCreationTimestamp="2026-02-26 11:14:09 +0000 UTC" firstStartedPulling="2026-02-26 11:14:12.188606254 +0000 UTC m=+197.999432688" lastFinishedPulling="2026-02-26 11:16:31.683838567 +0000 UTC m=+337.494665021" observedRunningTime="2026-02-26 11:16:47.735403415 +0000 UTC 
m=+353.546229859" watchObservedRunningTime="2026-02-26 11:17:06.899860393 +0000 UTC m=+372.710686827" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.900662 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.900721 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.905314 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.929530 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.929502098 podStartE2EDuration="18.929502098s" podCreationTimestamp="2026-02-26 11:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:17:06.925073902 +0000 UTC m=+372.735900346" watchObservedRunningTime="2026-02-26 11:17:06.929502098 +0000 UTC m=+372.740328532" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.964862 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.968036 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.972305 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 11:17:06.975954 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 11:17:06 crc kubenswrapper[4699]: I0226 
11:17:06.979538 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.009962 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.041746 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.067781 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.072725 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.134067 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.161967 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.176782 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.192696 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.382761 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.430059 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.508974 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.553100 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.566989 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.600299 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.632306 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.679399 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.705188 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.724260 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.739241 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.826412 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.859038 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.879560 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.923610 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.931451 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.988029 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.988847 4699 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 26 11:17:07 crc kubenswrapper[4699]: I0226 11:17:07.994485 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.031629 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.091166 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.128268 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.130233 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.266586 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.287837 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.288280 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.334025 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.387083 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.387679 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.441123 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.441461 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.507754 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.530379 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.533421 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.621990 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.659533 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.679757 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.703881 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.717879 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.722889 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.723427 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.772819 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.789881 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.814642 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.821019 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.845615 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.855898 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.872156 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.884799 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 26 11:17:08 crc kubenswrapper[4699]: I0226 11:17:08.890058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.063175 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.136772 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.145917 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.178531 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.181423 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.182539 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.261049 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.294541 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.399028 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.429409 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.536811 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.552194 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.581157 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.581430 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 11:17:09 crc kubenswrapper[4699]: I0226 11:17:09.992193 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.001349 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.002690 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.007960 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.201547 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.224566 4699 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.244342 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.260984 4699 scope.go:117] "RemoveContainer" containerID="6185db4f6fa07a6ece323e079cb5de2c410483b56be6a8ee0c600f7ca82c1ef1"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.314951 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.399412 4699 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.400078 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://faaf0aadacd79051543cdb9cfcd026bfc89d0e173e7f3faae8b64f52a92a3ab3" gracePeriod=5
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.447698 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.451506 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.489616 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.518548 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.526609 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.574864 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.606999 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.613794 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.657233 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.790142 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.838873 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.855856 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.879429 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.881462 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.916654 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.923221 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.926792 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 11:17:10 crc kubenswrapper[4699]: I0226 11:17:10.984506 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.004425 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.025255 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.068045 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.107844 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.267643 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.379462 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.472273 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.510942 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.526267 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.657954 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.729397 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.742810 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.914869 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 26 11:17:11 crc kubenswrapper[4699]: I0226 11:17:11.958538 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.084936 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.122312 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.137641 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.163390 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.165093 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.351543 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.357688 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.365434 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.432818 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.458400 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.463424 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.463478 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4d3fcdf2fd98cdbc61858f6f5382fbea38fad4c688754a8c137d17494350094e"}
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.564830 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 26 11:17:12 crc kubenswrapper[4699]: I0226 11:17:12.781161 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.280994 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.320131 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.320181 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.320230 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.320816 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"d8326aeadcc826fe0424355bd287bf65d6610bc258e44863eed96368db20aa6a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.320927 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://d8326aeadcc826fe0424355bd287bf65d6610bc258e44863eed96368db20aa6a" gracePeriod=30
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.362080 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.497723 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.563378 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.644053 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.678267 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.723665 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.754657 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.754862 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.761906 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.764137 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.900027 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.979317 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 11:17:13 crc kubenswrapper[4699]: I0226 11:17:13.980153 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.006001 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.008960 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.176577 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.184500 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.184807 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.226075 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.367914 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.498002 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.584151 4699 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.703864 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 26 11:17:14 crc kubenswrapper[4699]: I0226 11:17:14.878218 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 26 11:17:15 crc kubenswrapper[4699]: I0226 11:17:15.490096 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 11:17:15 crc kubenswrapper[4699]: I0226 11:17:15.490329 4699 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="faaf0aadacd79051543cdb9cfcd026bfc89d0e173e7f3faae8b64f52a92a3ab3" exitCode=137
Feb 26 11:17:15 crc kubenswrapper[4699]: I0226 11:17:15.996257 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 11:17:15 crc kubenswrapper[4699]: I0226 11:17:15.996336 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.070304 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087483 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087582 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087586 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087653 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087787 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087818 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087840 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087864 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.087939 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.088163 4699 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.088178 4699 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.088186 4699 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.088213 4699 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.099439 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.129417 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.189745 4699 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.266630 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.375083 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.498617 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.498725 4699 scope.go:117] "RemoveContainer" containerID="faaf0aadacd79051543cdb9cfcd026bfc89d0e173e7f3faae8b64f52a92a3ab3"
Feb 26 11:17:16 crc kubenswrapper[4699]: I0226 11:17:16.498761 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 11:17:24 crc kubenswrapper[4699]: I0226 11:17:24.476104 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.668351 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.671366 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.673177 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.673224 4699 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d8326aeadcc826fe0424355bd287bf65d6610bc258e44863eed96368db20aa6a" exitCode=137
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.673257 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d8326aeadcc826fe0424355bd287bf65d6610bc258e44863eed96368db20aa6a"}
Feb 26 11:17:43 crc kubenswrapper[4699]: I0226 11:17:43.673290 4699 scope.go:117] "RemoveContainer" containerID="bcbdf473c08abfc93be6ee643eb86aebdaf8cae59cbe4c844b800862b15f7434"
Feb 26 11:17:44 crc kubenswrapper[4699]: I0226 11:17:44.684029 4699 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 26 11:17:44 crc kubenswrapper[4699]: I0226 11:17:44.686680 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 11:17:44 crc kubenswrapper[4699]: I0226 11:17:44.686761 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab6a01aa53c295261f294f7fc7e981271d68ba8a24167aed4f9a82edd5da1265"} Feb 26 11:17:45 crc kubenswrapper[4699]: I0226 11:17:45.171261 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:17:53 crc kubenswrapper[4699]: I0226 11:17:53.320531 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:17:53 crc kubenswrapper[4699]: I0226 11:17:53.325243 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:17:55 crc kubenswrapper[4699]: I0226 11:17:55.176514 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.909337 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24qnt"] Feb 26 11:17:59 crc kubenswrapper[4699]: E0226 11:17:59.911348 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.911536 4699 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 11:17:59 crc kubenswrapper[4699]: E0226 11:17:59.911672 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" containerName="oc" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.911784 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" containerName="oc" Feb 26 11:17:59 crc kubenswrapper[4699]: E0226 11:17:59.911902 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" containerName="installer" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.912008 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" containerName="installer" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.912341 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.912458 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a904aa73-23d7-4994-882a-4afafe02fb82" containerName="installer" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.912548 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" containerName="oc" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.918514 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:17:59 crc kubenswrapper[4699]: I0226 11:17:59.970773 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24qnt"] Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027470 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de71c708-910f-44de-8a6c-93671ddc16ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfdh\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-kube-api-access-5dfdh\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027645 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-trusted-ca\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027775 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de71c708-910f-44de-8a6c-93671ddc16ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027853 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-bound-sa-token\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027887 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-registry-certificates\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.027950 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-registry-tls\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.028025 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.094040 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129170 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de71c708-910f-44de-8a6c-93671ddc16ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129256 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfdh\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-kube-api-access-5dfdh\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-trusted-ca\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129303 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de71c708-910f-44de-8a6c-93671ddc16ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129325 
4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-bound-sa-token\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129344 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-registry-tls\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.129358 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-registry-certificates\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.130533 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-registry-certificates\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.130790 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de71c708-910f-44de-8a6c-93671ddc16ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.131756 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de71c708-910f-44de-8a6c-93671ddc16ec-trusted-ca\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.138282 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de71c708-910f-44de-8a6c-93671ddc16ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.138530 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-registry-tls\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.154619 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfdh\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-kube-api-access-5dfdh\") pod \"image-registry-66df7c8f76-24qnt\" (UID: \"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.160431 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de71c708-910f-44de-8a6c-93671ddc16ec-bound-sa-token\") pod \"image-registry-66df7c8f76-24qnt\" (UID: 
\"de71c708-910f-44de-8a6c-93671ddc16ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.171459 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535078-ktbp9"] Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.172337 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.178284 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535078-ktbp9"] Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.179009 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.179019 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.179433 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.234805 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.331665 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rggfp\" (UniqueName: \"kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp\") pod \"auto-csr-approver-29535078-ktbp9\" (UID: \"4c181d85-a2e5-4771-a5a7-6cdd1f944012\") " pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.433151 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rggfp\" (UniqueName: \"kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp\") pod \"auto-csr-approver-29535078-ktbp9\" (UID: \"4c181d85-a2e5-4771-a5a7-6cdd1f944012\") " pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.453602 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rggfp\" (UniqueName: \"kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp\") pod \"auto-csr-approver-29535078-ktbp9\" (UID: \"4c181d85-a2e5-4771-a5a7-6cdd1f944012\") " pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.508991 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.687658 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24qnt"] Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.725919 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535078-ktbp9"] Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.786310 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" event={"ID":"de71c708-910f-44de-8a6c-93671ddc16ec","Type":"ContainerStarted","Data":"9cad155f29ac39dc68f3853e794d34c9ce1f1e95cf1f7316ec414ba766fa4b92"} Feb 26 11:18:00 crc kubenswrapper[4699]: I0226 11:18:00.788821 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" event={"ID":"4c181d85-a2e5-4771-a5a7-6cdd1f944012","Type":"ContainerStarted","Data":"a8b9a3e7ee013f3209491e0261ab397a01ff3add4cc6f302c1c814bc438cbfa0"} Feb 26 11:18:01 crc kubenswrapper[4699]: I0226 11:18:01.800697 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" event={"ID":"de71c708-910f-44de-8a6c-93671ddc16ec","Type":"ContainerStarted","Data":"755cf55d6dbdcb454ce511304bc1376acc6aa101a58f6c31fa5b30321cd709ed"} Feb 26 11:18:01 crc kubenswrapper[4699]: I0226 11:18:01.801818 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:01 crc kubenswrapper[4699]: I0226 11:18:01.824671 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" podStartSLOduration=2.8246496260000002 podStartE2EDuration="2.824649626s" podCreationTimestamp="2026-02-26 11:17:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:18:01.820198647 +0000 UTC m=+427.631025101" watchObservedRunningTime="2026-02-26 11:18:01.824649626 +0000 UTC m=+427.635476060" Feb 26 11:18:02 crc kubenswrapper[4699]: I0226 11:18:02.811613 4699 generic.go:334] "Generic (PLEG): container finished" podID="4c181d85-a2e5-4771-a5a7-6cdd1f944012" containerID="1eda56a25e25c14621838f63ba6ea80e65461406feb4a8836fe9fda800de7616" exitCode=0 Feb 26 11:18:02 crc kubenswrapper[4699]: I0226 11:18:02.811762 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" event={"ID":"4c181d85-a2e5-4771-a5a7-6cdd1f944012","Type":"ContainerDied","Data":"1eda56a25e25c14621838f63ba6ea80e65461406feb4a8836fe9fda800de7616"} Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.092506 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.185277 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rggfp\" (UniqueName: \"kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp\") pod \"4c181d85-a2e5-4771-a5a7-6cdd1f944012\" (UID: \"4c181d85-a2e5-4771-a5a7-6cdd1f944012\") " Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.191659 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp" (OuterVolumeSpecName: "kube-api-access-rggfp") pod "4c181d85-a2e5-4771-a5a7-6cdd1f944012" (UID: "4c181d85-a2e5-4771-a5a7-6cdd1f944012"). InnerVolumeSpecName "kube-api-access-rggfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.286868 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rggfp\" (UniqueName: \"kubernetes.io/projected/4c181d85-a2e5-4771-a5a7-6cdd1f944012-kube-api-access-rggfp\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.824719 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" event={"ID":"4c181d85-a2e5-4771-a5a7-6cdd1f944012","Type":"ContainerDied","Data":"a8b9a3e7ee013f3209491e0261ab397a01ff3add4cc6f302c1c814bc438cbfa0"} Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.824985 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b9a3e7ee013f3209491e0261ab397a01ff3add4cc6f302c1c814bc438cbfa0" Feb 26 11:18:04 crc kubenswrapper[4699]: I0226 11:18:04.824764 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535078-ktbp9" Feb 26 11:18:11 crc kubenswrapper[4699]: I0226 11:18:11.585510 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:18:11 crc kubenswrapper[4699]: I0226 11:18:11.585586 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:18:20 crc kubenswrapper[4699]: I0226 11:18:20.241474 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-24qnt" Feb 26 11:18:20 crc kubenswrapper[4699]: I0226 11:18:20.315889 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.737397 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.738501 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mzgjj" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server" containerID="cri-o://c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.743368 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phhbz"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.743644 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phhbz" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="registry-server" containerID="cri-o://4459df84e7aab7535bf4732238c87c4da5222e3237b69439fff20886a1ea7688" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.765556 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-czwkc"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.766035 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-czwkc" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="registry-server" containerID="cri-o://b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.774793 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.775212 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator" containerID="cri-o://2d5a0c0e5922846f3c8cbf86200e16bf7b7b0416026b9755460756c2a821cc04" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.797686 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.798060 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrk4n" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="registry-server" containerID="cri-o://aa39a04716f8cdcc694931265437b30c1cb1c3615ab11016472ce3e95c18688b" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.804021 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.804515 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s8kpz" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="registry-server" containerID="cri-o://bca562e2d2fabc5097841d6398d6c6b6a6779605566f4dd3173111ee1e8c04f3" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.817573 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nwbkq"] Feb 26 11:18:26 crc kubenswrapper[4699]: E0226 11:18:26.817841 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c181d85-a2e5-4771-a5a7-6cdd1f944012" containerName="oc" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.817856 4699 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4c181d85-a2e5-4771-a5a7-6cdd1f944012" containerName="oc" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.817978 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c181d85-a2e5-4771-a5a7-6cdd1f944012" containerName="oc" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.818438 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.822105 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.822178 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tff82\" (UniqueName: \"kubernetes.io/projected/43a980f6-1eff-4610-aa3e-69729c3eb7c7-kube-api-access-tff82\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.822198 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.822475 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-jhgks"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.822777 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jhgks" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="registry-server" containerID="cri-o://58ec65080ac68341b08f4272194fe62d85383a27766f002151749856e7c508e7" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.827722 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.828009 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sc9c6" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="registry-server" containerID="cri-o://704c8ba25f50ac5c881bb9d05eb872ee3851c9a21d28c2acf7a27a400acbebe0" gracePeriod=30 Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.830361 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nwbkq"] Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.923725 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.924214 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.924355 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tff82\" (UniqueName: \"kubernetes.io/projected/43a980f6-1eff-4610-aa3e-69729c3eb7c7-kube-api-access-tff82\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.925540 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.930720 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/43a980f6-1eff-4610-aa3e-69729c3eb7c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:26 crc kubenswrapper[4699]: I0226 11:18:26.948347 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tff82\" (UniqueName: \"kubernetes.io/projected/43a980f6-1eff-4610-aa3e-69729c3eb7c7-kube-api-access-tff82\") pod \"marketplace-operator-79b997595-nwbkq\" (UID: \"43a980f6-1eff-4610-aa3e-69729c3eb7c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.021696 4699 generic.go:334] "Generic (PLEG): container finished" podID="ac0026c3-1fad-4b34-9c42-389971f0c773" 
containerID="b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.021763 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerDied","Data":"b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.025952 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerID="58ec65080ac68341b08f4272194fe62d85383a27766f002151749856e7c508e7" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.026037 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerDied","Data":"58ec65080ac68341b08f4272194fe62d85383a27766f002151749856e7c508e7"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.028159 4699 generic.go:334] "Generic (PLEG): container finished" podID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerID="704c8ba25f50ac5c881bb9d05eb872ee3851c9a21d28c2acf7a27a400acbebe0" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.028606 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerDied","Data":"704c8ba25f50ac5c881bb9d05eb872ee3851c9a21d28c2acf7a27a400acbebe0"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.030974 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerID="aa39a04716f8cdcc694931265437b30c1cb1c3615ab11016472ce3e95c18688b" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.031073 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" 
event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerDied","Data":"aa39a04716f8cdcc694931265437b30c1cb1c3615ab11016472ce3e95c18688b"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.033480 4699 generic.go:334] "Generic (PLEG): container finished" podID="8c96a703-e568-4916-8035-a951ae91dc2b" containerID="bca562e2d2fabc5097841d6398d6c6b6a6779605566f4dd3173111ee1e8c04f3" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.033577 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerDied","Data":"bca562e2d2fabc5097841d6398d6c6b6a6779605566f4dd3173111ee1e8c04f3"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.036027 4699 generic.go:334] "Generic (PLEG): container finished" podID="9ea10063-7888-400e-af1c-216cbde5a13e" containerID="4459df84e7aab7535bf4732238c87c4da5222e3237b69439fff20886a1ea7688" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.036104 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerDied","Data":"4459df84e7aab7535bf4732238c87c4da5222e3237b69439fff20886a1ea7688"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.037375 4699 generic.go:334] "Generic (PLEG): container finished" podID="5cc10041-704b-4b00-8e4e-369103434b64" containerID="2d5a0c0e5922846f3c8cbf86200e16bf7b7b0416026b9755460756c2a821cc04" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.037438 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" event={"ID":"5cc10041-704b-4b00-8e4e-369103434b64","Type":"ContainerDied","Data":"2d5a0c0e5922846f3c8cbf86200e16bf7b7b0416026b9755460756c2a821cc04"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.039531 4699 generic.go:334] "Generic (PLEG): 
container finished" podID="71a83978-4f86-404b-967a-0e7493ff6721" containerID="c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" exitCode=0 Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.039633 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerDied","Data":"c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a"} Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.152538 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.576825 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a is running failed: container process not found" containerID="c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.577749 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a is running failed: container process not found" containerID="c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.578491 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a is running failed: container process not found" containerID="c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" 
cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.578534 4699 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-mzgjj" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.596616 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nwbkq"] Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.662910 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgjj" Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.684716 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca is running failed: container process not found" containerID="b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.687543 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca is running failed: container process not found" containerID="b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.688134 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca is running failed: container process not found" containerID="b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 11:18:27 crc kubenswrapper[4699]: E0226 11:18:27.688176 4699 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-czwkc" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="registry-server" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.719323 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phhbz" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.841839 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content\") pod \"9ea10063-7888-400e-af1c-216cbde5a13e\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.841897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities\") pod \"71a83978-4f86-404b-967a-0e7493ff6721\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.841929 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z6wd\" (UniqueName: \"kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd\") pod \"71a83978-4f86-404b-967a-0e7493ff6721\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " Feb 26 11:18:27 crc 
kubenswrapper[4699]: I0226 11:18:27.841969 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content\") pod \"71a83978-4f86-404b-967a-0e7493ff6721\" (UID: \"71a83978-4f86-404b-967a-0e7493ff6721\") " Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.842043 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-699tw\" (UniqueName: \"kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw\") pod \"9ea10063-7888-400e-af1c-216cbde5a13e\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.842066 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities\") pod \"9ea10063-7888-400e-af1c-216cbde5a13e\" (UID: \"9ea10063-7888-400e-af1c-216cbde5a13e\") " Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.842969 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities" (OuterVolumeSpecName: "utilities") pod "9ea10063-7888-400e-af1c-216cbde5a13e" (UID: "9ea10063-7888-400e-af1c-216cbde5a13e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.851013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw" (OuterVolumeSpecName: "kube-api-access-699tw") pod "9ea10063-7888-400e-af1c-216cbde5a13e" (UID: "9ea10063-7888-400e-af1c-216cbde5a13e"). InnerVolumeSpecName "kube-api-access-699tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.853779 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities" (OuterVolumeSpecName: "utilities") pod "71a83978-4f86-404b-967a-0e7493ff6721" (UID: "71a83978-4f86-404b-967a-0e7493ff6721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.854583 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd" (OuterVolumeSpecName: "kube-api-access-9z6wd") pod "71a83978-4f86-404b-967a-0e7493ff6721" (UID: "71a83978-4f86-404b-967a-0e7493ff6721"). InnerVolumeSpecName "kube-api-access-9z6wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.923420 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ea10063-7888-400e-af1c-216cbde5a13e" (UID: "9ea10063-7888-400e-af1c-216cbde5a13e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.924904 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71a83978-4f86-404b-967a-0e7493ff6721" (UID: "71a83978-4f86-404b-967a-0e7493ff6721"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950289 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950787 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950816 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z6wd\" (UniqueName: \"kubernetes.io/projected/71a83978-4f86-404b-967a-0e7493ff6721-kube-api-access-9z6wd\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950830 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a83978-4f86-404b-967a-0e7493ff6721-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950842 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-699tw\" (UniqueName: \"kubernetes.io/projected/9ea10063-7888-400e-af1c-216cbde5a13e-kube-api-access-699tw\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.950853 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea10063-7888-400e-af1c-216cbde5a13e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:18:27 crc kubenswrapper[4699]: I0226 11:18:27.996804 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.008765 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.010524 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.023024 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.035368 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czwkc" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.036355 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.054402 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" event={"ID":"5cc10041-704b-4b00-8e4e-369103434b64","Type":"ContainerDied","Data":"be07ebbed72d10e6a52397198b9b567e946941b2a2ee6b1a35e4358ea9958b9f"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.054455 4699 scope.go:117] "RemoveContainer" containerID="2d5a0c0e5922846f3c8cbf86200e16bf7b7b0416026b9755460756c2a821cc04" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.054562 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qf" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.058363 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgjj" event={"ID":"71a83978-4f86-404b-967a-0e7493ff6721","Type":"ContainerDied","Data":"1c165f4cac2c47ef0e2f5ea976276ea6634d20dbf88d2b070f23064d87eecce4"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.058499 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgjj" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.070505 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhgks" event={"ID":"6b9da973-6b5f-4485-adca-8792b0a3d256","Type":"ContainerDied","Data":"1df59f3f6cf47eeaee6c7803f5d095457eb18adeaca6dc9c81e5b0dfb758e003"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.070564 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jhgks" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.072911 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8kpz" event={"ID":"8c96a703-e568-4916-8035-a951ae91dc2b","Type":"ContainerDied","Data":"18a720cd12fbf1604976388b722cf7ea85f1660cb3d90ac7f016d51d465b43d1"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.073027 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8kpz" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.075457 4699 scope.go:117] "RemoveContainer" containerID="c40aadf0dd4571fbdbc8a87295a7b711ac5b2622ecf8b10a9b30c71872d6459a" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.079430 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phhbz" event={"ID":"9ea10063-7888-400e-af1c-216cbde5a13e","Type":"ContainerDied","Data":"c8dc58ce346d0f6b6aad0363b33f0cf4112745523923e0e7d1cf3d865b90372a"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.079498 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phhbz" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.081561 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czwkc" event={"ID":"ac0026c3-1fad-4b34-9c42-389971f0c773","Type":"ContainerDied","Data":"31376761fbf12a5b81018d6bde894ab4db92607e39e297d6342dce3d31049346"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.081636 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czwkc" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.084792 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sc9c6" event={"ID":"44d171ad-7d92-4c70-a686-65f60ded8a03","Type":"ContainerDied","Data":"8416abc544344d1375d554f38d43ac67e9642de8063e20464268f9eaf0d51147"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.084867 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sc9c6" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.086065 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" event={"ID":"43a980f6-1eff-4610-aa3e-69729c3eb7c7","Type":"ContainerStarted","Data":"1f6aa9e2c51069147ca61695d18adc4bd68eb808c51979e71400258ae22e6a56"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.086100 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" event={"ID":"43a980f6-1eff-4610-aa3e-69729c3eb7c7","Type":"ContainerStarted","Data":"cafc0def26259d80ebcc7f8a94991c05183edc9e566b9ec57a181642d2661d9b"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.088218 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.095747 4699 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nwbkq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.095950 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" podUID="43a980f6-1eff-4610-aa3e-69729c3eb7c7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.099254 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrk4n" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.099107 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrk4n" event={"ID":"6e7ddf51-5522-4085-8567-76c9a254ed15","Type":"ContainerDied","Data":"64ab7f5c1142b79d1cad6017fda721d048cccdd042121faa577213948620ffa2"} Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.115993 4699 scope.go:117] "RemoveContainer" containerID="f41fa5d8badc750f1371bec0896b93547f2bd25c6f1942a17a10cfb9c1edba94" Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.136988 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phhbz"] Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.141135 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phhbz"] Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153561 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqrqs\" (UniqueName: \"kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs\") pod \"ac0026c3-1fad-4b34-9c42-389971f0c773\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153624 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content\") pod \"6b9da973-6b5f-4485-adca-8792b0a3d256\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content\") pod \"6e7ddf51-5522-4085-8567-76c9a254ed15\" (UID: 
\"6e7ddf51-5522-4085-8567-76c9a254ed15\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153663 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tqhd\" (UniqueName: \"kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd\") pod \"44d171ad-7d92-4c70-a686-65f60ded8a03\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153682 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities\") pod \"6b9da973-6b5f-4485-adca-8792b0a3d256\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153699 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content\") pod \"44d171ad-7d92-4c70-a686-65f60ded8a03\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153719 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca\") pod \"5cc10041-704b-4b00-8e4e-369103434b64\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153766 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr725\" (UniqueName: \"kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725\") pod \"8c96a703-e568-4916-8035-a951ae91dc2b\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153799 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7xhdp\" (UniqueName: \"kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp\") pod \"6e7ddf51-5522-4085-8567-76c9a254ed15\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153817 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities\") pod \"8c96a703-e568-4916-8035-a951ae91dc2b\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content\") pod \"8c96a703-e568-4916-8035-a951ae91dc2b\" (UID: \"8c96a703-e568-4916-8035-a951ae91dc2b\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153869 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44jnw\" (UniqueName: \"kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw\") pod \"6b9da973-6b5f-4485-adca-8792b0a3d256\" (UID: \"6b9da973-6b5f-4485-adca-8792b0a3d256\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153886 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwq64\" (UniqueName: \"kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64\") pod \"5cc10041-704b-4b00-8e4e-369103434b64\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") " Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153923 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content\") pod \"ac0026c3-1fad-4b34-9c42-389971f0c773\" (UID: 
\"ac0026c3-1fad-4b34-9c42-389971f0c773\") "
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153941 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics\") pod \"5cc10041-704b-4b00-8e4e-369103434b64\" (UID: \"5cc10041-704b-4b00-8e4e-369103434b64\") "
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153962 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities\") pod \"ac0026c3-1fad-4b34-9c42-389971f0c773\" (UID: \"ac0026c3-1fad-4b34-9c42-389971f0c773\") "
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.153987 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities\") pod \"44d171ad-7d92-4c70-a686-65f60ded8a03\" (UID: \"44d171ad-7d92-4c70-a686-65f60ded8a03\") "
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.154015 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities\") pod \"6e7ddf51-5522-4085-8567-76c9a254ed15\" (UID: \"6e7ddf51-5522-4085-8567-76c9a254ed15\") "
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.158857 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5cc10041-704b-4b00-8e4e-369103434b64" (UID: "5cc10041-704b-4b00-8e4e-369103434b64"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.160057 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp" (OuterVolumeSpecName: "kube-api-access-7xhdp") pod "6e7ddf51-5522-4085-8567-76c9a254ed15" (UID: "6e7ddf51-5522-4085-8567-76c9a254ed15"). InnerVolumeSpecName "kube-api-access-7xhdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.160466 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities" (OuterVolumeSpecName: "utilities") pod "ac0026c3-1fad-4b34-9c42-389971f0c773" (UID: "ac0026c3-1fad-4b34-9c42-389971f0c773"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.161855 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities" (OuterVolumeSpecName: "utilities") pod "44d171ad-7d92-4c70-a686-65f60ded8a03" (UID: "44d171ad-7d92-4c70-a686-65f60ded8a03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.162756 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities" (OuterVolumeSpecName: "utilities") pod "6e7ddf51-5522-4085-8567-76c9a254ed15" (UID: "6e7ddf51-5522-4085-8567-76c9a254ed15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.162927 4699 scope.go:117] "RemoveContainer" containerID="f1b31944470f82af52e860af7004767cf2db0ef2acdf2a9986adc95701213e55"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.165175 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities" (OuterVolumeSpecName: "utilities") pod "8c96a703-e568-4916-8035-a951ae91dc2b" (UID: "8c96a703-e568-4916-8035-a951ae91dc2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.168306 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities" (OuterVolumeSpecName: "utilities") pod "6b9da973-6b5f-4485-adca-8792b0a3d256" (UID: "6b9da973-6b5f-4485-adca-8792b0a3d256"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.169554 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs" (OuterVolumeSpecName: "kube-api-access-rqrqs") pod "ac0026c3-1fad-4b34-9c42-389971f0c773" (UID: "ac0026c3-1fad-4b34-9c42-389971f0c773"). InnerVolumeSpecName "kube-api-access-rqrqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.172950 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64" (OuterVolumeSpecName: "kube-api-access-bwq64") pod "5cc10041-704b-4b00-8e4e-369103434b64" (UID: "5cc10041-704b-4b00-8e4e-369103434b64"). InnerVolumeSpecName "kube-api-access-bwq64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.173529 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5cc10041-704b-4b00-8e4e-369103434b64" (UID: "5cc10041-704b-4b00-8e4e-369103434b64"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.174801 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd" (OuterVolumeSpecName: "kube-api-access-2tqhd") pod "44d171ad-7d92-4c70-a686-65f60ded8a03" (UID: "44d171ad-7d92-4c70-a686-65f60ded8a03"). InnerVolumeSpecName "kube-api-access-2tqhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.175520 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725" (OuterVolumeSpecName: "kube-api-access-rr725") pod "8c96a703-e568-4916-8035-a951ae91dc2b" (UID: "8c96a703-e568-4916-8035-a951ae91dc2b"). InnerVolumeSpecName "kube-api-access-rr725". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.177413 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw" (OuterVolumeSpecName: "kube-api-access-44jnw") pod "6b9da973-6b5f-4485-adca-8792b0a3d256" (UID: "6b9da973-6b5f-4485-adca-8792b0a3d256"). InnerVolumeSpecName "kube-api-access-44jnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.190632 4699 scope.go:117] "RemoveContainer" containerID="58ec65080ac68341b08f4272194fe62d85383a27766f002151749856e7c508e7"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.212413 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.213348 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c96a703-e568-4916-8035-a951ae91dc2b" (UID: "8c96a703-e568-4916-8035-a951ae91dc2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.217684 4699 scope.go:117] "RemoveContainer" containerID="7d653e44fd8d815b615ce9635176302fd8a0ad6d3f93420c0c7d85da3992bebc"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.224689 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mzgjj"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.233865 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq" podStartSLOduration=2.233837938 podStartE2EDuration="2.233837938s" podCreationTimestamp="2026-02-26 11:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:18:28.224974652 +0000 UTC m=+454.035801106" watchObservedRunningTime="2026-02-26 11:18:28.233837938 +0000 UTC m=+454.044664372"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.244759 4699 scope.go:117] "RemoveContainer" containerID="e514effd43a8aac49eb2edbdb6959f6095c102c0f8bc4412986233930c5d5ff6"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255649 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhdp\" (UniqueName: \"kubernetes.io/projected/6e7ddf51-5522-4085-8567-76c9a254ed15-kube-api-access-7xhdp\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255887 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255920 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c96a703-e568-4916-8035-a951ae91dc2b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255934 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwq64\" (UniqueName: \"kubernetes.io/projected/5cc10041-704b-4b00-8e4e-369103434b64-kube-api-access-bwq64\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255947 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jnw\" (UniqueName: \"kubernetes.io/projected/6b9da973-6b5f-4485-adca-8792b0a3d256-kube-api-access-44jnw\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.255959 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256030 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256046 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256059 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256071 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqrqs\" (UniqueName: \"kubernetes.io/projected/ac0026c3-1fad-4b34-9c42-389971f0c773-kube-api-access-rqrqs\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256084 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tqhd\" (UniqueName: \"kubernetes.io/projected/44d171ad-7d92-4c70-a686-65f60ded8a03-kube-api-access-2tqhd\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.256098 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.257211 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cc10041-704b-4b00-8e4e-369103434b64-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.257243 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr725\" (UniqueName: \"kubernetes.io/projected/8c96a703-e568-4916-8035-a951ae91dc2b-kube-api-access-rr725\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.265159 4699 scope.go:117] "RemoveContainer" containerID="bca562e2d2fabc5097841d6398d6c6b6a6779605566f4dd3173111ee1e8c04f3"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.266458 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac0026c3-1fad-4b34-9c42-389971f0c773" (UID: "ac0026c3-1fad-4b34-9c42-389971f0c773"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.270775 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a83978-4f86-404b-967a-0e7493ff6721" path="/var/lib/kubelet/pods/71a83978-4f86-404b-967a-0e7493ff6721/volumes"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.273613 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" path="/var/lib/kubelet/pods/9ea10063-7888-400e-af1c-216cbde5a13e/volumes"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.282145 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e7ddf51-5522-4085-8567-76c9a254ed15" (UID: "6e7ddf51-5522-4085-8567-76c9a254ed15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.285524 4699 scope.go:117] "RemoveContainer" containerID="7480103b052e67e1c14af93c5ed9ab5b5c3150d0a1dbb5d35641a39bc2cc9515"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.331132 4699 scope.go:117] "RemoveContainer" containerID="0c88d150d726034804b09cdfd6ed7b9a516e4ecd807d5799c0ea12f3955c7b69"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.355820 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44d171ad-7d92-4c70-a686-65f60ded8a03" (UID: "44d171ad-7d92-4c70-a686-65f60ded8a03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.358984 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0026c3-1fad-4b34-9c42-389971f0c773-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.359016 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7ddf51-5522-4085-8567-76c9a254ed15-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.359038 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d171ad-7d92-4c70-a686-65f60ded8a03-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.362513 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b9da973-6b5f-4485-adca-8792b0a3d256" (UID: "6b9da973-6b5f-4485-adca-8792b0a3d256"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.372461 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.375924 4699 scope.go:117] "RemoveContainer" containerID="4459df84e7aab7535bf4732238c87c4da5222e3237b69439fff20886a1ea7688"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.376314 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qf"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.402489 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.405020 4699 scope.go:117] "RemoveContainer" containerID="c429ee05cb01901447a5e3bded424d4a0427e987ffd209a1f29754bcb9be9b4d"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.409196 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8kpz"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.423426 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jhgks"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.429240 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jhgks"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.442608 4699 scope.go:117] "RemoveContainer" containerID="e2ca3e75def51c6eedb622aaa6507c8da48849ebf241567dc8e903d48fc3a6e5"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.443527 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-czwkc"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.452263 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-czwkc"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.458750 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.459442 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9da973-6b5f-4485-adca-8792b0a3d256-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.462107 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrk4n"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.462107 4699 scope.go:117] "RemoveContainer" containerID="b3876e98ecf09cf7959a550cdff441956e5f5211a25918e3b5b1ff888aa1faca"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.466733 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.471540 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sc9c6"]
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.483003 4699 scope.go:117] "RemoveContainer" containerID="919888fa21cfe39704e1b0c864c73cd7cdeeac94e5ee1bb4c79246202be61323"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.498961 4699 scope.go:117] "RemoveContainer" containerID="39ff3a6e4269604cce0aea66db001b967d934c0076038e7958d8b015de9375a1"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.513816 4699 scope.go:117] "RemoveContainer" containerID="704c8ba25f50ac5c881bb9d05eb872ee3851c9a21d28c2acf7a27a400acbebe0"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.526737 4699 scope.go:117] "RemoveContainer" containerID="d27dda8ede66374aa47b77a60b930fa0b6c4e065e9c9b269dc3e8dd85fa02ece"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.547148 4699 scope.go:117] "RemoveContainer" containerID="0d415d903af1673dff3ecf368cade4c0a0a93c2b3158c0519393d68509c7e6d3"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.570311 4699 scope.go:117] "RemoveContainer" containerID="aa39a04716f8cdcc694931265437b30c1cb1c3615ab11016472ce3e95c18688b"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.585823 4699 scope.go:117] "RemoveContainer" containerID="e63934f65b729d4f1b8b668dbe9b4795f057f647c6b7a160c5e82634ad1de5fd"
Feb 26 11:18:28 crc kubenswrapper[4699]: I0226 11:18:28.608402 4699 scope.go:117] "RemoveContainer" containerID="3255d554cf00b3f149c14b7b5562baa6c773b2f01ac34c99e514e81d89810bb1"
Feb 26 11:18:29 crc kubenswrapper[4699]: I0226 11:18:29.117274 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nwbkq"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.268468 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" path="/var/lib/kubelet/pods/44d171ad-7d92-4c70-a686-65f60ded8a03/volumes"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.269363 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc10041-704b-4b00-8e4e-369103434b64" path="/var/lib/kubelet/pods/5cc10041-704b-4b00-8e4e-369103434b64/volumes"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.269926 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" path="/var/lib/kubelet/pods/6b9da973-6b5f-4485-adca-8792b0a3d256/volumes"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.271175 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" path="/var/lib/kubelet/pods/6e7ddf51-5522-4085-8567-76c9a254ed15/volumes"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.271915 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" path="/var/lib/kubelet/pods/8c96a703-e568-4916-8035-a951ae91dc2b/volumes"
Feb 26 11:18:30 crc kubenswrapper[4699]: I0226 11:18:30.273151 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" path="/var/lib/kubelet/pods/ac0026c3-1fad-4b34-9c42-389971f0c773/volumes"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978200 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r555d"]
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978895 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978909 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978917 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978923 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978933 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978940 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978946 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978952 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978961 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978967 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978974 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978979 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.978988 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.978993 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979002 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979010 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979017 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979023 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979031 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979037 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979044 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979050 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979058 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979063 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979071 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979076 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979083 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979088 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979096 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979101 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979109 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979132 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979141 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979147 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979153 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979158 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979169 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979175 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979183 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979188 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979197 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979202 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="extract-content"
Feb 26 11:18:36 crc kubenswrapper[4699]: E0226 11:18:36.979210 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979215 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="extract-utilities"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979303 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a83978-4f86-404b-967a-0e7493ff6721" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979312 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d171ad-7d92-4c70-a686-65f60ded8a03" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979320 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0026c3-1fad-4b34-9c42-389971f0c773" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979326 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc10041-704b-4b00-8e4e-369103434b64" containerName="marketplace-operator"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979332 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9da973-6b5f-4485-adca-8792b0a3d256" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979339 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c96a703-e568-4916-8035-a951ae91dc2b" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979348 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7ddf51-5522-4085-8567-76c9a254ed15" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.979355 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea10063-7888-400e-af1c-216cbde5a13e" containerName="registry-server"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.980077 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.982472 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 26 11:18:36 crc kubenswrapper[4699]: I0226 11:18:36.996364 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r555d"]
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.113442 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tpd\" (UniqueName: \"kubernetes.io/projected/d174508d-e5d5-4912-a652-e7b264f1c882-kube-api-access-l6tpd\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.113525 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-catalog-content\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.113592 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-utilities\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.179232 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xv8lg"]
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.180747 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xv8lg"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.184354 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.189737 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xv8lg"]
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.214440 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-utilities\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.214492 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tpd\" (UniqueName: \"kubernetes.io/projected/d174508d-e5d5-4912-a652-e7b264f1c882-kube-api-access-l6tpd\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.214546 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-catalog-content\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.214996 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-utilities\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d"
Feb
26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.215070 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d174508d-e5d5-4912-a652-e7b264f1c882-catalog-content\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.233402 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tpd\" (UniqueName: \"kubernetes.io/projected/d174508d-e5d5-4912-a652-e7b264f1c882-kube-api-access-l6tpd\") pod \"redhat-marketplace-r555d\" (UID: \"d174508d-e5d5-4912-a652-e7b264f1c882\") " pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.295260 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.315848 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.315939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.315955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwr8w\" (UniqueName: 
\"kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.496534 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwr8w\" (UniqueName: \"kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.496742 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.496782 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.497264 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.497335 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.520646 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwr8w\" (UniqueName: \"kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w\") pod \"redhat-operators-xv8lg\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.606900 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r555d"] Feb 26 11:18:37 crc kubenswrapper[4699]: I0226 11:18:37.810312 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:18:38 crc kubenswrapper[4699]: I0226 11:18:38.265759 4699 generic.go:334] "Generic (PLEG): container finished" podID="d174508d-e5d5-4912-a652-e7b264f1c882" containerID="c2143e2c5cc81b1899d2d5bc7fcd2c6e1c715acd804535ca070e55a22efaf376" exitCode=0 Feb 26 11:18:38 crc kubenswrapper[4699]: I0226 11:18:38.271536 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r555d" event={"ID":"d174508d-e5d5-4912-a652-e7b264f1c882","Type":"ContainerDied","Data":"c2143e2c5cc81b1899d2d5bc7fcd2c6e1c715acd804535ca070e55a22efaf376"} Feb 26 11:18:38 crc kubenswrapper[4699]: I0226 11:18:38.271578 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r555d" event={"ID":"d174508d-e5d5-4912-a652-e7b264f1c882","Type":"ContainerStarted","Data":"38f122ab998345047edb5b05c9d24d0513627c08f110dfa18f65cff552b1d59f"} Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.137742 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-xv8lg"] Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.273158 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerStarted","Data":"6d8c08def942c9655caee92b122902fc51271c1537ca60f7447fb09b383d1bcf"} Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.382025 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5vsj9"] Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.383996 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.388067 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.397108 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vsj9"] Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.496879 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-catalog-content\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.497007 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-utilities\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.497058 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knlhn\" (UniqueName: \"kubernetes.io/projected/a23d2795-eec2-4e37-8902-7f9220e44cb1-kube-api-access-knlhn\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.581514 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hfvdf"] Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.583657 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.585698 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.592917 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfvdf"] Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.598063 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-utilities\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.598123 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knlhn\" (UniqueName: \"kubernetes.io/projected/a23d2795-eec2-4e37-8902-7f9220e44cb1-kube-api-access-knlhn\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.598154 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-catalog-content\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.598758 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-utilities\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.598772 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23d2795-eec2-4e37-8902-7f9220e44cb1-catalog-content\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.620825 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knlhn\" (UniqueName: \"kubernetes.io/projected/a23d2795-eec2-4e37-8902-7f9220e44cb1-kube-api-access-knlhn\") pod \"certified-operators-5vsj9\" (UID: \"a23d2795-eec2-4e37-8902-7f9220e44cb1\") " pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.698939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcflx\" (UniqueName: \"kubernetes.io/projected/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-kube-api-access-xcflx\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.699254 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-utilities\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.699274 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-catalog-content\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.703347 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vsj9" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.800212 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcflx\" (UniqueName: \"kubernetes.io/projected/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-kube-api-access-xcflx\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.800279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-utilities\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.800304 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-catalog-content\") pod 
\"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.800881 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-utilities\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.801134 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-catalog-content\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:39 crc kubenswrapper[4699]: I0226 11:18:39.839806 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcflx\" (UniqueName: \"kubernetes.io/projected/fde9effb-9fa9-46a0-a8e6-08080ed0b8ba-kube-api-access-xcflx\") pod \"community-operators-hfvdf\" (UID: \"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba\") " pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.064529 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hfvdf" Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.132345 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vsj9"] Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.282359 4699 generic.go:334] "Generic (PLEG): container finished" podID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerID="388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5" exitCode=0 Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.282475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerDied","Data":"388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5"} Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.283979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vsj9" event={"ID":"a23d2795-eec2-4e37-8902-7f9220e44cb1","Type":"ContainerStarted","Data":"bfe1b91bcedbbe9a51ded533b5d6175c6435fb9eeb1f1689657fa3d9850ba37f"} Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.285910 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r555d" event={"ID":"d174508d-e5d5-4912-a652-e7b264f1c882","Type":"ContainerStarted","Data":"01b60bb8ecce1da4cf03c4ab1174ba5408fadff25b132bfc5331957adca04cdf"} Feb 26 11:18:40 crc kubenswrapper[4699]: I0226 11:18:40.315634 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfvdf"] Feb 26 11:18:40 crc kubenswrapper[4699]: W0226 11:18:40.385205 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde9effb_9fa9_46a0_a8e6_08080ed0b8ba.slice/crio-e830db26342abc0c12262a99f12cecb9b41f22c38af368f13422f72dfaa8737b WatchSource:0}: Error finding container 
e830db26342abc0c12262a99f12cecb9b41f22c38af368f13422f72dfaa8737b: Status 404 returned error can't find the container with id e830db26342abc0c12262a99f12cecb9b41f22c38af368f13422f72dfaa8737b Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.293517 4699 generic.go:334] "Generic (PLEG): container finished" podID="fde9effb-9fa9-46a0-a8e6-08080ed0b8ba" containerID="0b3c4cb225a0334e1da0494e85efeb36160c8802eabfdd450996c572adea01e2" exitCode=0 Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.293848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfvdf" event={"ID":"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba","Type":"ContainerDied","Data":"0b3c4cb225a0334e1da0494e85efeb36160c8802eabfdd450996c572adea01e2"} Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.293878 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfvdf" event={"ID":"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba","Type":"ContainerStarted","Data":"e830db26342abc0c12262a99f12cecb9b41f22c38af368f13422f72dfaa8737b"} Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.296952 4699 generic.go:334] "Generic (PLEG): container finished" podID="d174508d-e5d5-4912-a652-e7b264f1c882" containerID="01b60bb8ecce1da4cf03c4ab1174ba5408fadff25b132bfc5331957adca04cdf" exitCode=0 Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.297022 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r555d" event={"ID":"d174508d-e5d5-4912-a652-e7b264f1c882","Type":"ContainerDied","Data":"01b60bb8ecce1da4cf03c4ab1174ba5408fadff25b132bfc5331957adca04cdf"} Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.300141 4699 generic.go:334] "Generic (PLEG): container finished" podID="a23d2795-eec2-4e37-8902-7f9220e44cb1" containerID="49c756eff3fcf51adce9012c86148834634fc07c8832d17ca62ff816d2c86ae3" exitCode=0 Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.300168 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vsj9" event={"ID":"a23d2795-eec2-4e37-8902-7f9220e44cb1","Type":"ContainerDied","Data":"49c756eff3fcf51adce9012c86148834634fc07c8832d17ca62ff816d2c86ae3"} Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.586748 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:18:41 crc kubenswrapper[4699]: I0226 11:18:41.586826 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:18:42 crc kubenswrapper[4699]: I0226 11:18:42.307145 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerStarted","Data":"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4"} Feb 26 11:18:42 crc kubenswrapper[4699]: I0226 11:18:42.309615 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r555d" event={"ID":"d174508d-e5d5-4912-a652-e7b264f1c882","Type":"ContainerStarted","Data":"315e6d4c826d7007192db54c8b333215cec7a542897127c31fb01afbf5a995ac"} Feb 26 11:18:42 crc kubenswrapper[4699]: I0226 11:18:42.353333 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r555d" podStartSLOduration=2.777968489 podStartE2EDuration="6.353316901s" podCreationTimestamp="2026-02-26 11:18:36 +0000 UTC" firstStartedPulling="2026-02-26 
11:18:38.26742501 +0000 UTC m=+464.078251444" lastFinishedPulling="2026-02-26 11:18:41.842773422 +0000 UTC m=+467.653599856" observedRunningTime="2026-02-26 11:18:42.351128227 +0000 UTC m=+468.161954681" watchObservedRunningTime="2026-02-26 11:18:42.353316901 +0000 UTC m=+468.164143335" Feb 26 11:18:43 crc kubenswrapper[4699]: I0226 11:18:43.330783 4699 generic.go:334] "Generic (PLEG): container finished" podID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerID="b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4" exitCode=0 Feb 26 11:18:43 crc kubenswrapper[4699]: I0226 11:18:43.331745 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerDied","Data":"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4"} Feb 26 11:18:43 crc kubenswrapper[4699]: I0226 11:18:43.335138 4699 generic.go:334] "Generic (PLEG): container finished" podID="fde9effb-9fa9-46a0-a8e6-08080ed0b8ba" containerID="44cffda81fe937c8fdf483f307285a80ac13d127d8ff22a73b84d040fdf0e363" exitCode=0 Feb 26 11:18:43 crc kubenswrapper[4699]: I0226 11:18:43.335360 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfvdf" event={"ID":"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba","Type":"ContainerDied","Data":"44cffda81fe937c8fdf483f307285a80ac13d127d8ff22a73b84d040fdf0e363"} Feb 26 11:18:44 crc kubenswrapper[4699]: I0226 11:18:44.345594 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vsj9" event={"ID":"a23d2795-eec2-4e37-8902-7f9220e44cb1","Type":"ContainerStarted","Data":"a4a2d76947fef9ec12842bfe0c7c36789ec7162e796e8641057b89f3c7cf4b88"} Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.352195 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" 
event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerStarted","Data":"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9"} Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.354030 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" podUID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" containerName="registry" containerID="cri-o://4779da011a858c6a8df3a7fdfbfb2e01a004c252953c916a440f89808caa4efd" gracePeriod=30 Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.354419 4699 generic.go:334] "Generic (PLEG): container finished" podID="a23d2795-eec2-4e37-8902-7f9220e44cb1" containerID="a4a2d76947fef9ec12842bfe0c7c36789ec7162e796e8641057b89f3c7cf4b88" exitCode=0 Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.354472 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vsj9" event={"ID":"a23d2795-eec2-4e37-8902-7f9220e44cb1","Type":"ContainerDied","Data":"a4a2d76947fef9ec12842bfe0c7c36789ec7162e796e8641057b89f3c7cf4b88"} Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.358040 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfvdf" event={"ID":"fde9effb-9fa9-46a0-a8e6-08080ed0b8ba","Type":"ContainerStarted","Data":"eaf06ce3d6f30adf2fd276e186ccb2d44080d7c105c0c98ee38e551bb6793706"} Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.732427 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xv8lg" podStartSLOduration=4.693495705 podStartE2EDuration="8.732405289s" podCreationTimestamp="2026-02-26 11:18:37 +0000 UTC" firstStartedPulling="2026-02-26 11:18:40.285895009 +0000 UTC m=+466.096721443" lastFinishedPulling="2026-02-26 11:18:44.324804583 +0000 UTC m=+470.135631027" observedRunningTime="2026-02-26 11:18:45.374641369 +0000 UTC m=+471.185467813" 
watchObservedRunningTime="2026-02-26 11:18:45.732405289 +0000 UTC m=+471.543231743" Feb 26 11:18:45 crc kubenswrapper[4699]: I0226 11:18:45.785931 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hfvdf" podStartSLOduration=3.63857007 podStartE2EDuration="6.785910179s" podCreationTimestamp="2026-02-26 11:18:39 +0000 UTC" firstStartedPulling="2026-02-26 11:18:41.29556515 +0000 UTC m=+467.106391594" lastFinishedPulling="2026-02-26 11:18:44.442905279 +0000 UTC m=+470.253731703" observedRunningTime="2026-02-26 11:18:45.777562822 +0000 UTC m=+471.588389256" watchObservedRunningTime="2026-02-26 11:18:45.785910179 +0000 UTC m=+471.596736623" Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.295827 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.296168 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.355439 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r555d" Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.467166 4699 generic.go:334] "Generic (PLEG): container finished" podID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" containerID="4779da011a858c6a8df3a7fdfbfb2e01a004c252953c916a440f89808caa4efd" exitCode=0 Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.467261 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" event={"ID":"7232eb23-31ae-4e72-ae27-c256dc4cac9a","Type":"ContainerDied","Data":"4779da011a858c6a8df3a7fdfbfb2e01a004c252953c916a440f89808caa4efd"} Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.503590 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-r555d"
Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.811614 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xv8lg"
Feb 26 11:18:47 crc kubenswrapper[4699]: I0226 11:18:47.811690 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xv8lg"
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.471439 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.475938 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8656" event={"ID":"7232eb23-31ae-4e72-ae27-c256dc4cac9a","Type":"ContainerDied","Data":"39ab42bfea1ba6c0800a2508ff52b7eb12199142899ef006de5ffbee4f2135a3"}
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.475992 4699 scope.go:117] "RemoveContainer" containerID="4779da011a858c6a8df3a7fdfbfb2e01a004c252953c916a440f89808caa4efd"
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.476006 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8656"
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.577869 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.577915 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.577945 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.577977 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trrf7\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.577996 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.578033 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.578059 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.578084 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token\") pod \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\" (UID: \"7232eb23-31ae-4e72-ae27-c256dc4cac9a\") "
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.578866 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.578903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.584142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.585508 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.586794 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7" (OuterVolumeSpecName: "kube-api-access-trrf7") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "kube-api-access-trrf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.590519 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.598803 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680177 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680210 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680221 4699 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7232eb23-31ae-4e72-ae27-c256dc4cac9a-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680229 4699 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680237 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trrf7\" (UniqueName: \"kubernetes.io/projected/7232eb23-31ae-4e72-ae27-c256dc4cac9a-kube-api-access-trrf7\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680247 4699 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7232eb23-31ae-4e72-ae27-c256dc4cac9a-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.680255 4699 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7232eb23-31ae-4e72-ae27-c256dc4cac9a-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.773683 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7232eb23-31ae-4e72-ae27-c256dc4cac9a" (UID: "7232eb23-31ae-4e72-ae27-c256dc4cac9a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.805444 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"]
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.809272 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8656"]
Feb 26 11:18:48 crc kubenswrapper[4699]: I0226 11:18:48.850566 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xv8lg" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="registry-server" probeResult="failure" output=<
Feb 26 11:18:48 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s
Feb 26 11:18:48 crc kubenswrapper[4699]: >
Feb 26 11:18:50 crc kubenswrapper[4699]: I0226 11:18:50.065651 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hfvdf"
Feb 26 11:18:50 crc kubenswrapper[4699]: I0226 11:18:50.065964 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hfvdf"
Feb 26 11:18:50 crc kubenswrapper[4699]: I0226 11:18:50.108025 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hfvdf"
Feb 26 11:18:50 crc kubenswrapper[4699]: I0226 11:18:50.268414 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" path="/var/lib/kubelet/pods/7232eb23-31ae-4e72-ae27-c256dc4cac9a/volumes"
Feb 26 11:18:50 crc kubenswrapper[4699]: I0226 11:18:50.530029 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hfvdf"
Feb 26 11:18:54 crc kubenswrapper[4699]: I0226 11:18:54.517327 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vsj9" event={"ID":"a23d2795-eec2-4e37-8902-7f9220e44cb1","Type":"ContainerStarted","Data":"216a021af4af58458326e2f9e398377f0f7c5a03927b8be1378311556dacc286"}
Feb 26 11:18:54 crc kubenswrapper[4699]: I0226 11:18:54.533073 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5vsj9" podStartSLOduration=3.348663783 podStartE2EDuration="15.533054513s" podCreationTimestamp="2026-02-26 11:18:39 +0000 UTC" firstStartedPulling="2026-02-26 11:18:41.301809934 +0000 UTC m=+467.112636358" lastFinishedPulling="2026-02-26 11:18:53.486200654 +0000 UTC m=+479.297027088" observedRunningTime="2026-02-26 11:18:54.53159225 +0000 UTC m=+480.342418684" watchObservedRunningTime="2026-02-26 11:18:54.533054513 +0000 UTC m=+480.343880947"
Feb 26 11:18:57 crc kubenswrapper[4699]: I0226 11:18:57.851647 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xv8lg"
Feb 26 11:18:57 crc kubenswrapper[4699]: I0226 11:18:57.898143 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xv8lg"
Feb 26 11:18:59 crc kubenswrapper[4699]: I0226 11:18:59.704473 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5vsj9"
Feb 26 11:18:59 crc kubenswrapper[4699]: I0226 11:18:59.704856 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5vsj9"
Feb 26 11:18:59 crc kubenswrapper[4699]: I0226 11:18:59.751010 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5vsj9"
Feb 26 11:19:00 crc kubenswrapper[4699]: I0226 11:19:00.589918 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5vsj9"
Feb 26 11:19:11 crc kubenswrapper[4699]: I0226 11:19:11.585204 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:19:11 crc kubenswrapper[4699]: I0226 11:19:11.585730 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:19:11 crc kubenswrapper[4699]: I0226 11:19:11.585779 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79"
Feb 26 11:19:11 crc kubenswrapper[4699]: I0226 11:19:11.586367 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 11:19:11 crc kubenswrapper[4699]: I0226 11:19:11.586430 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e" gracePeriod=600
Feb 26 11:19:12 crc kubenswrapper[4699]: I0226 11:19:12.617565 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e" exitCode=0
Feb 26 11:19:12 crc kubenswrapper[4699]: I0226 11:19:12.617664 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e"}
Feb 26 11:19:12 crc kubenswrapper[4699]: I0226 11:19:12.618144 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512"}
Feb 26 11:19:12 crc kubenswrapper[4699]: I0226 11:19:12.618164 4699 scope.go:117] "RemoveContainer" containerID="0bdaf031a20943a74ba41b521a83b5b10a6051c6df8a453d647cb318131a97b4"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.132970 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535080-dcs8z"]
Feb 26 11:20:00 crc kubenswrapper[4699]: E0226 11:20:00.133705 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" containerName="registry"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.133717 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" containerName="registry"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.133808 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7232eb23-31ae-4e72-ae27-c256dc4cac9a" containerName="registry"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.134189 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.136589 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.137623 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.137916 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.141626 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535080-dcs8z"]
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.304059 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6k6l\" (UniqueName: \"kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l\") pod \"auto-csr-approver-29535080-dcs8z\" (UID: \"c9ea4516-0708-4b4a-9dd5-75e6220a55d4\") " pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.405365 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6k6l\" (UniqueName: \"kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l\") pod \"auto-csr-approver-29535080-dcs8z\" (UID: \"c9ea4516-0708-4b4a-9dd5-75e6220a55d4\") " pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.428660 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6k6l\" (UniqueName: \"kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l\") pod \"auto-csr-approver-29535080-dcs8z\" (UID: \"c9ea4516-0708-4b4a-9dd5-75e6220a55d4\") " pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.460500 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.880407 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535080-dcs8z"]
Feb 26 11:20:00 crc kubenswrapper[4699]: I0226 11:20:00.892522 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 11:20:01 crc kubenswrapper[4699]: I0226 11:20:01.887129 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535080-dcs8z" event={"ID":"c9ea4516-0708-4b4a-9dd5-75e6220a55d4","Type":"ContainerStarted","Data":"9290cf98df79f5c6f276fe0487d74b1e9d60f8b47efda86fef48a4429d494e23"}
Feb 26 11:20:03 crc kubenswrapper[4699]: I0226 11:20:03.899715 4699 generic.go:334] "Generic (PLEG): container finished" podID="c9ea4516-0708-4b4a-9dd5-75e6220a55d4" containerID="ec18e4fa3c26a9a3b620eb9c167811e69c8b0db26c298c317aa409e857f17f0c" exitCode=0
Feb 26 11:20:03 crc kubenswrapper[4699]: I0226 11:20:03.899776 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535080-dcs8z" event={"ID":"c9ea4516-0708-4b4a-9dd5-75e6220a55d4","Type":"ContainerDied","Data":"ec18e4fa3c26a9a3b620eb9c167811e69c8b0db26c298c317aa409e857f17f0c"}
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.152850 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.266316 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6k6l\" (UniqueName: \"kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l\") pod \"c9ea4516-0708-4b4a-9dd5-75e6220a55d4\" (UID: \"c9ea4516-0708-4b4a-9dd5-75e6220a55d4\") "
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.273415 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l" (OuterVolumeSpecName: "kube-api-access-p6k6l") pod "c9ea4516-0708-4b4a-9dd5-75e6220a55d4" (UID: "c9ea4516-0708-4b4a-9dd5-75e6220a55d4"). InnerVolumeSpecName "kube-api-access-p6k6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.367906 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6k6l\" (UniqueName: \"kubernetes.io/projected/c9ea4516-0708-4b4a-9dd5-75e6220a55d4-kube-api-access-p6k6l\") on node \"crc\" DevicePath \"\""
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.914490 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535080-dcs8z" event={"ID":"c9ea4516-0708-4b4a-9dd5-75e6220a55d4","Type":"ContainerDied","Data":"9290cf98df79f5c6f276fe0487d74b1e9d60f8b47efda86fef48a4429d494e23"}
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.914540 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9290cf98df79f5c6f276fe0487d74b1e9d60f8b47efda86fef48a4429d494e23"
Feb 26 11:20:05 crc kubenswrapper[4699]: I0226 11:20:05.914619 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535080-dcs8z"
Feb 26 11:20:06 crc kubenswrapper[4699]: I0226 11:20:06.205082 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535074-bjfld"]
Feb 26 11:20:06 crc kubenswrapper[4699]: I0226 11:20:06.211519 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535074-bjfld"]
Feb 26 11:20:06 crc kubenswrapper[4699]: I0226 11:20:06.268026 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d444da-9127-459c-97c6-cdcff5b20e67" path="/var/lib/kubelet/pods/30d444da-9127-459c-97c6-cdcff5b20e67/volumes"
Feb 26 11:20:09 crc kubenswrapper[4699]: I0226 11:20:09.630917 4699 patch_prober.go:28] interesting pod/oauth-openshift-f54c45747-bbg8s container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 11:20:09 crc kubenswrapper[4699]: I0226 11:20:09.633574 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" podUID="c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:20:09 crc kubenswrapper[4699]: I0226 11:20:09.635845 4699 patch_prober.go:28] interesting pod/oauth-openshift-f54c45747-bbg8s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 11:20:09 crc kubenswrapper[4699]: I0226 11:20:09.635912 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-f54c45747-bbg8s" podUID="c1d0f1cb-3dbe-4e38-ad2d-579f02a34f3a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:20:20 crc kubenswrapper[4699]: I0226 11:20:20.763864 4699 scope.go:117] "RemoveContainer" containerID="52ffe1a540a589fb575f8cfc11cab09c8b7aa57c3ace31541c3b66e087bf8460"
Feb 26 11:21:11 crc kubenswrapper[4699]: I0226 11:21:11.584640 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:21:11 crc kubenswrapper[4699]: I0226 11:21:11.585264 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:21:41 crc kubenswrapper[4699]: I0226 11:21:41.585073 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:21:41 crc kubenswrapper[4699]: I0226 11:21:41.585658 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.134082 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535082-2l88q"]
Feb 26 11:22:00 crc kubenswrapper[4699]: E0226 11:22:00.134835 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ea4516-0708-4b4a-9dd5-75e6220a55d4" containerName="oc"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.134849 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ea4516-0708-4b4a-9dd5-75e6220a55d4" containerName="oc"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.135007 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ea4516-0708-4b4a-9dd5-75e6220a55d4" containerName="oc"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.135558 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.137276 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.137313 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.137340 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.144466 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535082-2l88q"]
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.259520 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7k4\" (UniqueName: \"kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4\") pod \"auto-csr-approver-29535082-2l88q\" (UID: \"b96109ee-edc2-496a-b6bc-cffad5fb9a40\") " pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.360519 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7k4\" (UniqueName: \"kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4\") pod \"auto-csr-approver-29535082-2l88q\" (UID: \"b96109ee-edc2-496a-b6bc-cffad5fb9a40\") " pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.380237 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7k4\" (UniqueName: \"kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4\") pod \"auto-csr-approver-29535082-2l88q\" (UID: \"b96109ee-edc2-496a-b6bc-cffad5fb9a40\") " pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.455168 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.696988 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535082-2l88q"]
Feb 26 11:22:00 crc kubenswrapper[4699]: I0226 11:22:00.966108 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535082-2l88q" event={"ID":"b96109ee-edc2-496a-b6bc-cffad5fb9a40","Type":"ContainerStarted","Data":"af90e1811d7b4178074ee9001855fddd8d93df324958be18cb5138e0bbeaca19"}
Feb 26 11:22:01 crc kubenswrapper[4699]: I0226 11:22:01.974234 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535082-2l88q" event={"ID":"b96109ee-edc2-496a-b6bc-cffad5fb9a40","Type":"ContainerStarted","Data":"13f8b1b98d014497027ee7037eac5f0ce1bbfdb9879bcfae0154cb4a61717ad1"}
Feb 26 11:22:02 crc kubenswrapper[4699]: I0226 11:22:02.983125 4699 generic.go:334] "Generic (PLEG): container finished" podID="b96109ee-edc2-496a-b6bc-cffad5fb9a40" containerID="13f8b1b98d014497027ee7037eac5f0ce1bbfdb9879bcfae0154cb4a61717ad1" exitCode=0
Feb 26 11:22:02 crc kubenswrapper[4699]: I0226 11:22:02.983179 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535082-2l88q" event={"ID":"b96109ee-edc2-496a-b6bc-cffad5fb9a40","Type":"ContainerDied","Data":"13f8b1b98d014497027ee7037eac5f0ce1bbfdb9879bcfae0154cb4a61717ad1"}
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.169169 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.310059 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7k4\" (UniqueName: \"kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4\") pod \"b96109ee-edc2-496a-b6bc-cffad5fb9a40\" (UID: \"b96109ee-edc2-496a-b6bc-cffad5fb9a40\") "
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.315054 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4" (OuterVolumeSpecName: "kube-api-access-sc7k4") pod "b96109ee-edc2-496a-b6bc-cffad5fb9a40" (UID: "b96109ee-edc2-496a-b6bc-cffad5fb9a40"). InnerVolumeSpecName "kube-api-access-sc7k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.412011 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7k4\" (UniqueName: \"kubernetes.io/projected/b96109ee-edc2-496a-b6bc-cffad5fb9a40-kube-api-access-sc7k4\") on node \"crc\" DevicePath \"\""
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.995782 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535082-2l88q" event={"ID":"b96109ee-edc2-496a-b6bc-cffad5fb9a40","Type":"ContainerDied","Data":"af90e1811d7b4178074ee9001855fddd8d93df324958be18cb5138e0bbeaca19"}
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.995824 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535082-2l88q"
Feb 26 11:22:04 crc kubenswrapper[4699]: I0226 11:22:04.995827 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af90e1811d7b4178074ee9001855fddd8d93df324958be18cb5138e0bbeaca19"
Feb 26 11:22:05 crc kubenswrapper[4699]: I0226 11:22:05.031193 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535076-rv9x5"]
Feb 26 11:22:05 crc kubenswrapper[4699]: I0226 11:22:05.034790 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535076-rv9x5"]
Feb 26 11:22:06 crc kubenswrapper[4699]: I0226 11:22:06.268199 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d9d78c8-4193-47a8-9ed9-208f6dc25831" path="/var/lib/kubelet/pods/0d9d78c8-4193-47a8-9ed9-208f6dc25831/volumes"
Feb 26 11:22:11 crc kubenswrapper[4699]: I0226 11:22:11.584849 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:22:11 crc kubenswrapper[4699]: I0226 11:22:11.585231 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:22:11 crc kubenswrapper[4699]: I0226 11:22:11.585276 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79"
Feb 26 11:22:11 crc kubenswrapper[4699]: I0226 11:22:11.585891 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 11:22:11 crc kubenswrapper[4699]: I0226 11:22:11.585947 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512" gracePeriod=600
Feb 26 11:22:12 crc kubenswrapper[4699]: I0226 11:22:12.056890 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512" exitCode=0
Feb 26 11:22:12 crc kubenswrapper[4699]: I0226 11:22:12.056945 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512"}
Feb 26 11:22:12 crc kubenswrapper[4699]: I0226 11:22:12.057014 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca"}
Feb 26 11:22:12 crc kubenswrapper[4699]: I0226 11:22:12.057038 4699 scope.go:117] "RemoveContainer" containerID="650d424704999ccaef77ddc678846c35c1a480092b312ddf8beddcd52de6fa7e"
Feb 26 11:22:20 crc kubenswrapper[4699]: I0226 11:22:20.810217 4699 scope.go:117] "RemoveContainer" containerID="576debb0d3d58f5281816cda92fedce6f78492ddc1301cf006959585594f82b9"
Feb 26 11:22:20 crc kubenswrapper[4699]: I0226 11:22:20.832868 4699 scope.go:117] "RemoveContainer" containerID="19a60f72e3a64feb9f04d813b42f9a20a08e1ed258c497a9b61b68ad603f4b5b"
Feb 26 11:22:20 crc kubenswrapper[4699]: I0226 11:22:20.868013 4699 scope.go:117] "RemoveContainer" containerID="b8eedef066fa8aaa6df130360e7ae91b6c35c65386cf0e1eb331ae24b87e6305"
Feb 26 11:22:20 crc kubenswrapper[4699]: I0226 11:22:20.880712 4699 scope.go:117] "RemoveContainer" containerID="000757444f955626a5cade194e8afdfce85b9f484def8b4bc1703641245c47c3"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.146875 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535084-h8xlt"]
Feb 26 11:24:00 crc kubenswrapper[4699]: E0226 11:24:00.147703 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96109ee-edc2-496a-b6bc-cffad5fb9a40" containerName="oc"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.147719 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96109ee-edc2-496a-b6bc-cffad5fb9a40" containerName="oc"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.147835 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96109ee-edc2-496a-b6bc-cffad5fb9a40" containerName="oc"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.148330 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535084-h8xlt"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.150731 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.151033 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.151459 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.155999 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535084-h8xlt"]
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.181229 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97pv\" (UniqueName: \"kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv\") pod \"auto-csr-approver-29535084-h8xlt\" (UID: \"98d6d072-33c5-4660-b6c3-80344c215e6a\") " pod="openshift-infra/auto-csr-approver-29535084-h8xlt"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.282419 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97pv\" (UniqueName: \"kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv\") pod \"auto-csr-approver-29535084-h8xlt\" (UID: \"98d6d072-33c5-4660-b6c3-80344c215e6a\") " pod="openshift-infra/auto-csr-approver-29535084-h8xlt"
Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.304562 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97pv\" (UniqueName: \"kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv\") pod \"auto-csr-approver-29535084-h8xlt\" (UID: \"98d6d072-33c5-4660-b6c3-80344c215e6a\") "
pod="openshift-infra/auto-csr-approver-29535084-h8xlt" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.466903 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.672779 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535084-h8xlt"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.737275 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dswxp"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.739771 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.745800 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.746066 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vg6rm" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.746256 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.753973 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dswxp"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.759537 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fhn2n"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.760240 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fhn2n" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.853591 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9d424" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.853685 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvc8\" (UniqueName: \"kubernetes.io/projected/f026799a-39c7-443e-9801-f046ba8ae94b-kube-api-access-rzvc8\") pod \"cert-manager-cainjector-cf98fcc89-dswxp\" (UID: \"f026799a-39c7-443e-9801-f046ba8ae94b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.853750 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4gj\" (UniqueName: \"kubernetes.io/projected/fc42522b-c5f4-4df2-8435-3e3985dd960c-kube-api-access-2k4gj\") pod \"cert-manager-858654f9db-fhn2n\" (UID: \"fc42522b-c5f4-4df2-8435-3e3985dd960c\") " pod="cert-manager/cert-manager-858654f9db-fhn2n" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.886160 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2fdt"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.887247 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.890062 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2qr8t" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.895775 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fhn2n"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.899448 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2fdt"] Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.910047 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" event={"ID":"98d6d072-33c5-4660-b6c3-80344c215e6a","Type":"ContainerStarted","Data":"b06895e9f19dc046adcb983ba655ea56046b63cab9016b1a0bc760d4d3b03db8"} Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.954869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4gj\" (UniqueName: \"kubernetes.io/projected/fc42522b-c5f4-4df2-8435-3e3985dd960c-kube-api-access-2k4gj\") pod \"cert-manager-858654f9db-fhn2n\" (UID: \"fc42522b-c5f4-4df2-8435-3e3985dd960c\") " pod="cert-manager/cert-manager-858654f9db-fhn2n" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.954972 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvc8\" (UniqueName: \"kubernetes.io/projected/f026799a-39c7-443e-9801-f046ba8ae94b-kube-api-access-rzvc8\") pod \"cert-manager-cainjector-cf98fcc89-dswxp\" (UID: \"f026799a-39c7-443e-9801-f046ba8ae94b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.972244 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvc8\" (UniqueName: 
\"kubernetes.io/projected/f026799a-39c7-443e-9801-f046ba8ae94b-kube-api-access-rzvc8\") pod \"cert-manager-cainjector-cf98fcc89-dswxp\" (UID: \"f026799a-39c7-443e-9801-f046ba8ae94b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" Feb 26 11:24:00 crc kubenswrapper[4699]: I0226 11:24:00.973128 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4gj\" (UniqueName: \"kubernetes.io/projected/fc42522b-c5f4-4df2-8435-3e3985dd960c-kube-api-access-2k4gj\") pod \"cert-manager-858654f9db-fhn2n\" (UID: \"fc42522b-c5f4-4df2-8435-3e3985dd960c\") " pod="cert-manager/cert-manager-858654f9db-fhn2n" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.055779 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9c48\" (UniqueName: \"kubernetes.io/projected/fad1f923-b22c-4c0d-9eb9-684636bc76c0-kube-api-access-x9c48\") pod \"cert-manager-webhook-687f57d79b-l2fdt\" (UID: \"fad1f923-b22c-4c0d-9eb9-684636bc76c0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.157781 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9c48\" (UniqueName: \"kubernetes.io/projected/fad1f923-b22c-4c0d-9eb9-684636bc76c0-kube-api-access-x9c48\") pod \"cert-manager-webhook-687f57d79b-l2fdt\" (UID: \"fad1f923-b22c-4c0d-9eb9-684636bc76c0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.164180 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.174983 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9c48\" (UniqueName: \"kubernetes.io/projected/fad1f923-b22c-4c0d-9eb9-684636bc76c0-kube-api-access-x9c48\") pod \"cert-manager-webhook-687f57d79b-l2fdt\" (UID: \"fad1f923-b22c-4c0d-9eb9-684636bc76c0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.186390 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fhn2n" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.213217 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.396520 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dswxp"] Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.442190 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fhn2n"] Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.507376 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2fdt"] Feb 26 11:24:01 crc kubenswrapper[4699]: W0226 11:24:01.512510 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfad1f923_b22c_4c0d_9eb9_684636bc76c0.slice/crio-61b5767bf2cc0e2aeb7053618a452eb32aa80ba4f35dc6bbadfca4095e4ef427 WatchSource:0}: Error finding container 61b5767bf2cc0e2aeb7053618a452eb32aa80ba4f35dc6bbadfca4095e4ef427: Status 404 returned error can't find the container with id 61b5767bf2cc0e2aeb7053618a452eb32aa80ba4f35dc6bbadfca4095e4ef427 Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 
11:24:01.918337 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" event={"ID":"fad1f923-b22c-4c0d-9eb9-684636bc76c0","Type":"ContainerStarted","Data":"61b5767bf2cc0e2aeb7053618a452eb32aa80ba4f35dc6bbadfca4095e4ef427"} Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.919245 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fhn2n" event={"ID":"fc42522b-c5f4-4df2-8435-3e3985dd960c","Type":"ContainerStarted","Data":"0770a7f4946f9eacb522a566fd126b99a9173b2f6a09dffbab49aae7b55670e8"} Feb 26 11:24:01 crc kubenswrapper[4699]: I0226 11:24:01.920085 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" event={"ID":"f026799a-39c7-443e-9801-f046ba8ae94b","Type":"ContainerStarted","Data":"6d69bca4bcae0b85c4b685d53076413e159390406d3114e3cba2c6dd09d6006d"} Feb 26 11:24:04 crc kubenswrapper[4699]: I0226 11:24:04.936249 4699 generic.go:334] "Generic (PLEG): container finished" podID="98d6d072-33c5-4660-b6c3-80344c215e6a" containerID="dd76d54940753753e3f7a2683a8c241e99cd1928bc9d5ed547595d83c46f6f57" exitCode=0 Feb 26 11:24:04 crc kubenswrapper[4699]: I0226 11:24:04.936359 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" event={"ID":"98d6d072-33c5-4660-b6c3-80344c215e6a","Type":"ContainerDied","Data":"dd76d54940753753e3f7a2683a8c241e99cd1928bc9d5ed547595d83c46f6f57"} Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.167804 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.324182 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f97pv\" (UniqueName: \"kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv\") pod \"98d6d072-33c5-4660-b6c3-80344c215e6a\" (UID: \"98d6d072-33c5-4660-b6c3-80344c215e6a\") " Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.330281 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv" (OuterVolumeSpecName: "kube-api-access-f97pv") pod "98d6d072-33c5-4660-b6c3-80344c215e6a" (UID: "98d6d072-33c5-4660-b6c3-80344c215e6a"). InnerVolumeSpecName "kube-api-access-f97pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.425157 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f97pv\" (UniqueName: \"kubernetes.io/projected/98d6d072-33c5-4660-b6c3-80344c215e6a-kube-api-access-f97pv\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.949823 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" event={"ID":"98d6d072-33c5-4660-b6c3-80344c215e6a","Type":"ContainerDied","Data":"b06895e9f19dc046adcb983ba655ea56046b63cab9016b1a0bc760d4d3b03db8"} Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.950472 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06895e9f19dc046adcb983ba655ea56046b63cab9016b1a0bc760d4d3b03db8" Feb 26 11:24:06 crc kubenswrapper[4699]: I0226 11:24:06.949873 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535084-h8xlt" Feb 26 11:24:07 crc kubenswrapper[4699]: I0226 11:24:07.220743 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535078-ktbp9"] Feb 26 11:24:07 crc kubenswrapper[4699]: I0226 11:24:07.223555 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535078-ktbp9"] Feb 26 11:24:08 crc kubenswrapper[4699]: I0226 11:24:08.269540 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c181d85-a2e5-4771-a5a7-6cdd1f944012" path="/var/lib/kubelet/pods/4c181d85-a2e5-4771-a5a7-6cdd1f944012/volumes" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.492937 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cw6vx"] Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.493909 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-controller" containerID="cri-o://74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.503003 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.503183 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="nbdb" containerID="cri-o://5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 
11:24:10.503237 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="northd" containerID="cri-o://e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.504240 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-acl-logging" containerID="cri-o://bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.504313 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-node" containerID="cri-o://d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.504356 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="sbdb" containerID="cri-o://8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.548557 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" containerID="cri-o://674b1ddc9ce52057921afe22948e78b0ac743b734851b7422144e06a6bedf770" gracePeriod=30 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.973299 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovnkube-controller/2.log" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.976105 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-acl-logging/0.log" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.976653 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-controller/0.log" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977147 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="674b1ddc9ce52057921afe22948e78b0ac743b734851b7422144e06a6bedf770" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977258 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977341 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977429 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977516 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977608 4699 generic.go:334] "Generic (PLEG): container finished" 
podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38" exitCode=0 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977744 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f" exitCode=143 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977813 4699 generic.go:334] "Generic (PLEG): container finished" podID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerID="74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f" exitCode=143 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977781 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"674b1ddc9ce52057921afe22948e78b0ac743b734851b7422144e06a6bedf770"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.977944 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978001 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978039 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978050 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978061 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978070 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978079 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.978096 4699 scope.go:117] "RemoveContainer" containerID="063e6f60e4a1aae569cb37aa15eefe27977691a15f63d85e697b606d5a1771d0" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.979817 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2k6b7_32ce77d1-5287-4674-aeda-810070efbb29/kube-multus/1.log" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.980380 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2k6b7_32ce77d1-5287-4674-aeda-810070efbb29/kube-multus/0.log" Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.980588 4699 generic.go:334] "Generic (PLEG): container finished" podID="32ce77d1-5287-4674-aeda-810070efbb29" 
containerID="143a97abf6e80c5d27a74181526e16c9b98e3306181c3568beb75b7c14de4b31" exitCode=2 Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.980671 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerDied","Data":"143a97abf6e80c5d27a74181526e16c9b98e3306181c3568beb75b7c14de4b31"} Feb 26 11:24:10 crc kubenswrapper[4699]: I0226 11:24:10.981244 4699 scope.go:117] "RemoveContainer" containerID="143a97abf6e80c5d27a74181526e16c9b98e3306181c3568beb75b7c14de4b31" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.585567 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.585915 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.685700 4699 scope.go:117] "RemoveContainer" containerID="b60c607682186fb1f56b927c4ef394f4d6f9e9d14be5c7d13b1b52d84541aee3" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.815767 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-acl-logging/0.log" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.816270 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-controller/0.log" Feb 26 11:24:11 crc kubenswrapper[4699]: 
I0226 11:24:11.816781 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.878630 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4v2nm"] Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.878937 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kubecfg-setup" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.878964 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kubecfg-setup" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.878983 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.878995 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879009 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879021 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879034 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879044 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 
11:24:11.879062 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="sbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879073 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="sbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879091 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="northd" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879102 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="northd" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879138 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-acl-logging" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879149 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-acl-logging" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879163 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="nbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879174 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="nbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879194 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-node" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879204 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-node" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879218 4699 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879228 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879243 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d6d072-33c5-4660-b6c3-80344c215e6a" containerName="oc" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879254 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d6d072-33c5-4660-b6c3-80344c215e6a" containerName="oc" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879267 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879277 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879421 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879438 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879450 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="nbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879464 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-node" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879481 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879497 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879509 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d6d072-33c5-4660-b6c3-80344c215e6a" containerName="oc" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879522 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovn-acl-logging" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879540 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="northd" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879557 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879572 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="sbdb" Feb 26 11:24:11 crc kubenswrapper[4699]: E0226 11:24:11.879735 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879749 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.879884 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" containerName="ovnkube-controller" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.884167 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.895811 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.895892 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log" (OuterVolumeSpecName: "node-log") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896042 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896067 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-node-log\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896095 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-config\") pod \"ovnkube-node-4v2nm\" (UID: 
\"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896137 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-kubelet\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896164 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-env-overrides\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-slash\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896224 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-var-lib-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896244 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e225412-9202-42a4-8244-7a8a6355fcaf-ovn-node-metrics-cert\") pod 
\"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896266 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896292 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-netd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896387 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg59h\" (UniqueName: \"kubernetes.io/projected/6e225412-9202-42a4-8244-7a8a6355fcaf-kube-api-access-hg59h\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896532 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-systemd-units\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896680 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-netns\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896747 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-bin\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896789 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-etc-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896825 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-systemd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896867 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-log-socket\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896882 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-script-lib\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896923 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.896953 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-ovn\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.897029 4699 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-node-log\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.989248 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-acl-logging/0.log" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.990191 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cw6vx_cd12b2df-7af6-45bc-88e7-d5e5e6451e65/ovn-controller/0.log" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.990555 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" 
event={"ID":"cd12b2df-7af6-45bc-88e7-d5e5e6451e65","Type":"ContainerDied","Data":"90b46f5a3e61ec03394a2be7ff4739209b903f31912a7a66807fca0693899985"} Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.990603 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw6vx" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.990641 4699 scope.go:117] "RemoveContainer" containerID="674b1ddc9ce52057921afe22948e78b0ac743b734851b7422144e06a6bedf770" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.992591 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2k6b7_32ce77d1-5287-4674-aeda-810070efbb29/kube-multus/1.log" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.992662 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2k6b7" event={"ID":"32ce77d1-5287-4674-aeda-810070efbb29","Type":"ContainerStarted","Data":"c0b2606aa6761275edf27264b0d44368ad12c528dde0ab91e2f612830847c483"} Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997274 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997331 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997353 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997366 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997383 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmg2\" (UniqueName: \"kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997402 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997416 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997433 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc 
kubenswrapper[4699]: I0226 11:24:11.997455 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997462 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997558 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997541 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997475 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997640 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket" (OuterVolumeSpecName: "log-socket") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997641 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997666 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash" (OuterVolumeSpecName: "host-slash") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997691 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997699 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997744 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997767 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997794 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: 
\"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997836 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997864 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997918 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin\") pod \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\" (UID: \"cd12b2df-7af6-45bc-88e7-d5e5e6451e65\") " Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.997983 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998009 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998040 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998024 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998064 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998075 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998085 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998091 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-bin\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998128 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998163 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-etc-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-bin\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998196 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-systemd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998230 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-etc-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998274 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-log-socket\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 
11:24:11.998298 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-script-lib\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998353 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998361 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-systemd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998378 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-ovn\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998386 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-log-socket\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998400 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998420 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-node-log\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998426 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998435 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998449 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998461 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-node-log\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998494 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-ovn\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998519 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-config\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998553 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-kubelet\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 
11:24:11.998616 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-env-overrides\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998648 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-slash\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998701 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-var-lib-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998726 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e225412-9202-42a4-8244-7a8a6355fcaf-ovn-node-metrics-cert\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998652 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-kubelet\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998757 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998790 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-netd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998817 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg59h\" (UniqueName: \"kubernetes.io/projected/6e225412-9202-42a4-8244-7a8a6355fcaf-kube-api-access-hg59h\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998844 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-systemd-units\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-netns\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998945 4699 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998961 4699 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998972 4699 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998982 4699 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.998993 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999005 4699 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999015 4699 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999027 4699 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-log-socket\") on node \"crc\" DevicePath \"\"" Feb 26 
11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999039 4699 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999050 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999060 4699 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-slash\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999072 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999082 4699 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999093 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999104 4699 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999138 4699 
reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999173 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-run-netns\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999187 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-script-lib\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999215 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-cni-netd\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999253 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-var-lib-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999346 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-ovnkube-config\") pod \"ovnkube-node-4v2nm\" 
(UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999389 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-host-slash\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999493 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-systemd-units\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:11 crc kubenswrapper[4699]: I0226 11:24:11.999529 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e225412-9202-42a4-8244-7a8a6355fcaf-run-openvswitch\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:11.999895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e225412-9202-42a4-8244-7a8a6355fcaf-env-overrides\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.003668 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.003720 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e225412-9202-42a4-8244-7a8a6355fcaf-ovn-node-metrics-cert\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.004060 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2" (OuterVolumeSpecName: "kube-api-access-tnmg2") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "kube-api-access-tnmg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.014004 4699 scope.go:117] "RemoveContainer" containerID="8eeb219af62ecea95e9f4c720e0a14e4e1e1040fdd269f558f7f41fae5ee962f" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.016879 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg59h\" (UniqueName: \"kubernetes.io/projected/6e225412-9202-42a4-8244-7a8a6355fcaf-kube-api-access-hg59h\") pod \"ovnkube-node-4v2nm\" (UID: \"6e225412-9202-42a4-8244-7a8a6355fcaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.017495 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "cd12b2df-7af6-45bc-88e7-d5e5e6451e65" (UID: "cd12b2df-7af6-45bc-88e7-d5e5e6451e65"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.027972 4699 scope.go:117] "RemoveContainer" containerID="5e753f4df3718432af139b5f02ab082262c2c8291cde6e4c137e9ff0ccc5d99f" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.041724 4699 scope.go:117] "RemoveContainer" containerID="e01dd23c87a8712fd245585cd433a7a1866c51b962b1fda3443d9a7769b4c39f" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.053916 4699 scope.go:117] "RemoveContainer" containerID="fca4b97859dbd31955adbd54e5f1367a778a5c3cf0090aadf13b78bd1a22730c" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.067538 4699 scope.go:117] "RemoveContainer" containerID="d2a8ccfd692d3ac3cdee1225316983c92d15802aa700defa7e85a3addf9e1f38" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.080544 4699 scope.go:117] "RemoveContainer" containerID="bdb9d3ff849ad635bd6190a474ab109a5f6bcefcc7a0f2f16b94487089d13e6f" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.094295 4699 scope.go:117] "RemoveContainer" containerID="74540ecf8d835bb553c16798af12c9682a5bd4609120308ba6fc6a0b9aa1f75f" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.100618 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.100647 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmg2\" (UniqueName: \"kubernetes.io/projected/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-kube-api-access-tnmg2\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.100659 4699 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cd12b2df-7af6-45bc-88e7-d5e5e6451e65-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 
11:24:12.110819 4699 scope.go:117] "RemoveContainer" containerID="f0570cc3dc8f007e88024df69cf743715a318df16fb782800259eccc698cb124" Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.201641 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:12 crc kubenswrapper[4699]: W0226 11:24:12.219232 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e225412_9202_42a4_8244_7a8a6355fcaf.slice/crio-ead38c998cc2a72439d65b79bc2d7dae518f48af2797c07c59c43124b08d6bf3 WatchSource:0}: Error finding container ead38c998cc2a72439d65b79bc2d7dae518f48af2797c07c59c43124b08d6bf3: Status 404 returned error can't find the container with id ead38c998cc2a72439d65b79bc2d7dae518f48af2797c07c59c43124b08d6bf3 Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.342738 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cw6vx"] Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.347056 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cw6vx"] Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.997714 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e225412-9202-42a4-8244-7a8a6355fcaf" containerID="b5583008b20615d2641e157ba520e6ef11eaf1ee067d0e0367d39e9785af9683" exitCode=0 Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.997783 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerDied","Data":"b5583008b20615d2641e157ba520e6ef11eaf1ee067d0e0367d39e9785af9683"} Feb 26 11:24:12 crc kubenswrapper[4699]: I0226 11:24:12.997809 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" 
event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"ead38c998cc2a72439d65b79bc2d7dae518f48af2797c07c59c43124b08d6bf3"} Feb 26 11:24:14 crc kubenswrapper[4699]: I0226 11:24:14.268093 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd12b2df-7af6-45bc-88e7-d5e5e6451e65" path="/var/lib/kubelet/pods/cd12b2df-7af6-45bc-88e7-d5e5e6451e65/volumes" Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.016974 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"e5228953a471c5bf62e601425e4a4e1e7a9bca3e8a2fb987d42d53ac18cbf41b"} Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.017020 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"4d44ebcb4b5934d350c6342c1ef712a9e604aa7b1b7e888c21af059eb4020c57"} Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.017035 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"b53670f2a1d77ad666ea86f6f35f3bf9efd756f4b62bcad6358d429572d280b1"} Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.017048 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"8736e52a22701dfef846f9f69791fa56f45c49d6832f030bdd8c30aaeef58f44"} Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.017059 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" 
event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"b39d19c3f4b39eca0df8cfaa18890e0ac5f460252a98b91a74533005ff326381"} Feb 26 11:24:15 crc kubenswrapper[4699]: I0226 11:24:15.816188 4699 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 11:24:16 crc kubenswrapper[4699]: I0226 11:24:16.026035 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"66408311a440f996a585906d6b0bb296c0978e4365f541be1cd36119bf734d2e"} Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.040544 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"44ea8bcccfbbd1e0439ea7b2c30686c35af18b0f333d640e283ef0d40ee573c1"} Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.043441 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" event={"ID":"fad1f923-b22c-4c0d-9eb9-684636bc76c0","Type":"ContainerStarted","Data":"601a1f9bc395916dc73e3649a9402bb2463b9927ed19700fe8b4858c159ddc6a"} Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.043538 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.044993 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fhn2n" event={"ID":"fc42522b-c5f4-4df2-8435-3e3985dd960c","Type":"ContainerStarted","Data":"e3515b76794064e529f97d98d4b61fc037a56271092bcbd9a727eeab2b391225"} Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.046832 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" 
event={"ID":"f026799a-39c7-443e-9801-f046ba8ae94b","Type":"ContainerStarted","Data":"59d440b27262243190a759fd9d6f8c7c9f604fcdff40437906a0f3452c0c3b79"} Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.062297 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" podStartSLOduration=2.513520577 podStartE2EDuration="18.062278471s" podCreationTimestamp="2026-02-26 11:24:00 +0000 UTC" firstStartedPulling="2026-02-26 11:24:01.515616206 +0000 UTC m=+787.326442640" lastFinishedPulling="2026-02-26 11:24:17.06437411 +0000 UTC m=+802.875200534" observedRunningTime="2026-02-26 11:24:18.059911702 +0000 UTC m=+803.870738136" watchObservedRunningTime="2026-02-26 11:24:18.062278471 +0000 UTC m=+803.873104925" Feb 26 11:24:18 crc kubenswrapper[4699]: I0226 11:24:18.076734 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fhn2n" podStartSLOduration=2.48237991 podStartE2EDuration="18.076719532s" podCreationTimestamp="2026-02-26 11:24:00 +0000 UTC" firstStartedPulling="2026-02-26 11:24:01.463441886 +0000 UTC m=+787.274268320" lastFinishedPulling="2026-02-26 11:24:17.057781508 +0000 UTC m=+802.868607942" observedRunningTime="2026-02-26 11:24:18.074205049 +0000 UTC m=+803.885031493" watchObservedRunningTime="2026-02-26 11:24:18.076719532 +0000 UTC m=+803.887545966" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.060822 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" event={"ID":"6e225412-9202-42a4-8244-7a8a6355fcaf","Type":"ContainerStarted","Data":"ae018b0b119c23ef3fe1b58334318bb0b0101e5d55a9aef36c1acb828b434379"} Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.061227 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.061246 4699 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.061258 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.086037 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.088367 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dswxp" podStartSLOduration=4.465270038 podStartE2EDuration="20.088354167s" podCreationTimestamp="2026-02-26 11:24:00 +0000 UTC" firstStartedPulling="2026-02-26 11:24:01.419830826 +0000 UTC m=+787.230657250" lastFinishedPulling="2026-02-26 11:24:17.042914955 +0000 UTC m=+802.853741379" observedRunningTime="2026-02-26 11:24:18.089739181 +0000 UTC m=+803.900565615" watchObservedRunningTime="2026-02-26 11:24:20.088354167 +0000 UTC m=+805.899180601" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.089498 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" podStartSLOduration=9.0894927 podStartE2EDuration="9.0894927s" podCreationTimestamp="2026-02-26 11:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:24:20.087274545 +0000 UTC m=+805.898100989" watchObservedRunningTime="2026-02-26 11:24:20.0894927 +0000 UTC m=+805.900319134" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.091575 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:24:20 crc kubenswrapper[4699]: I0226 11:24:20.939048 4699 scope.go:117] "RemoveContainer" 
containerID="1eda56a25e25c14621838f63ba6ea80e65461406feb4a8836fe9fda800de7616" Feb 26 11:24:26 crc kubenswrapper[4699]: I0226 11:24:26.215331 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-l2fdt" Feb 26 11:24:41 crc kubenswrapper[4699]: I0226 11:24:41.584768 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:24:41 crc kubenswrapper[4699]: I0226 11:24:41.585359 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:24:42 crc kubenswrapper[4699]: I0226 11:24:42.225431 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v2nm" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.268415 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5"] Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.270021 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.273158 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.278653 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5"] Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.456415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.456501 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.456547 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcm2\" (UniqueName: \"kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: 
I0226 11:25:08.558108 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcm2\" (UniqueName: \"kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.558224 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.558268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.558971 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.559341 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.578285 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hcm2\" (UniqueName: \"kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:08 crc kubenswrapper[4699]: I0226 11:25:08.594002 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.084933 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5"] Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.324722 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerStarted","Data":"b9b833dc0ad614a27fa356e39b9a35f75196680f8a3d5999e5f99c8756fcb337"} Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.324766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerStarted","Data":"8e60f093fadd5b8f255fd685567fcc1049fd5ee2845fd36a5cbe4b8a83c5a17f"} Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.811410 4699 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wqmqz"] Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.812690 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.822891 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqmqz"] Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.976974 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-utilities\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.977616 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzrw\" (UniqueName: \"kubernetes.io/projected/a69df934-4fa7-472d-abe7-8fa4ec5d4296-kube-api-access-bqzrw\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:09 crc kubenswrapper[4699]: I0226 11:25:09.977682 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-catalog-content\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.079063 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-utilities\") pod \"redhat-operators-wqmqz\" (UID: 
\"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.079174 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzrw\" (UniqueName: \"kubernetes.io/projected/a69df934-4fa7-472d-abe7-8fa4ec5d4296-kube-api-access-bqzrw\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.079221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-catalog-content\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.080027 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-utilities\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.080080 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a69df934-4fa7-472d-abe7-8fa4ec5d4296-catalog-content\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.108728 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzrw\" (UniqueName: \"kubernetes.io/projected/a69df934-4fa7-472d-abe7-8fa4ec5d4296-kube-api-access-bqzrw\") pod \"redhat-operators-wqmqz\" (UID: \"a69df934-4fa7-472d-abe7-8fa4ec5d4296\") " 
pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.131357 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.335313 4699 generic.go:334] "Generic (PLEG): container finished" podID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerID="b9b833dc0ad614a27fa356e39b9a35f75196680f8a3d5999e5f99c8756fcb337" exitCode=0 Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.335365 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerDied","Data":"b9b833dc0ad614a27fa356e39b9a35f75196680f8a3d5999e5f99c8756fcb337"} Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.337013 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:25:10 crc kubenswrapper[4699]: I0226 11:25:10.468185 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqmqz"] Feb 26 11:25:10 crc kubenswrapper[4699]: W0226 11:25:10.476169 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69df934_4fa7_472d_abe7_8fa4ec5d4296.slice/crio-6f28cea8ca9217cfb71b838533ff5198ca00a3639bdd1298a79995ee2e3cd5ff WatchSource:0}: Error finding container 6f28cea8ca9217cfb71b838533ff5198ca00a3639bdd1298a79995ee2e3cd5ff: Status 404 returned error can't find the container with id 6f28cea8ca9217cfb71b838533ff5198ca00a3639bdd1298a79995ee2e3cd5ff Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.341538 4699 generic.go:334] "Generic (PLEG): container finished" podID="a69df934-4fa7-472d-abe7-8fa4ec5d4296" containerID="97d5eea8e56120b9332d89f62e107ab0fa56bbf98f0f8c93efb08ea22a900d84" exitCode=0 Feb 26 
11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.341584 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqmqz" event={"ID":"a69df934-4fa7-472d-abe7-8fa4ec5d4296","Type":"ContainerDied","Data":"97d5eea8e56120b9332d89f62e107ab0fa56bbf98f0f8c93efb08ea22a900d84"} Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.341610 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqmqz" event={"ID":"a69df934-4fa7-472d-abe7-8fa4ec5d4296","Type":"ContainerStarted","Data":"6f28cea8ca9217cfb71b838533ff5198ca00a3639bdd1298a79995ee2e3cd5ff"} Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.585445 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.585729 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.585832 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.586521 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Feb 26 11:25:11 crc kubenswrapper[4699]: I0226 11:25:11.586674 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca" gracePeriod=600 Feb 26 11:25:12 crc kubenswrapper[4699]: I0226 11:25:12.349529 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca" exitCode=0 Feb 26 11:25:12 crc kubenswrapper[4699]: I0226 11:25:12.349579 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca"} Feb 26 11:25:12 crc kubenswrapper[4699]: I0226 11:25:12.350145 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7"} Feb 26 11:25:12 crc kubenswrapper[4699]: I0226 11:25:12.350172 4699 scope.go:117] "RemoveContainer" containerID="b71edd76e1595d983e68eaa39c03589da9abd360ecf74eeb3e44306707c89512" Feb 26 11:25:13 crc kubenswrapper[4699]: I0226 11:25:13.360700 4699 generic.go:334] "Generic (PLEG): container finished" podID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerID="0133aa6fda56a47cc137b971e9f9e5b35387818b4f9389af34a3ab9ed0a72a2e" exitCode=0 Feb 26 11:25:13 crc kubenswrapper[4699]: I0226 11:25:13.360791 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" 
event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerDied","Data":"0133aa6fda56a47cc137b971e9f9e5b35387818b4f9389af34a3ab9ed0a72a2e"} Feb 26 11:25:14 crc kubenswrapper[4699]: I0226 11:25:14.374191 4699 generic.go:334] "Generic (PLEG): container finished" podID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerID="1081701f4f8e2b852fa913e23a40fca64b36b6412291cd67cb93addd8c21658d" exitCode=0 Feb 26 11:25:14 crc kubenswrapper[4699]: I0226 11:25:14.374266 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerDied","Data":"1081701f4f8e2b852fa913e23a40fca64b36b6412291cd67cb93addd8c21658d"} Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.734711 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.865282 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util\") pod \"a0751c34-68ec-4fd1-821f-94e314dd5621\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.865343 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hcm2\" (UniqueName: \"kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2\") pod \"a0751c34-68ec-4fd1-821f-94e314dd5621\" (UID: \"a0751c34-68ec-4fd1-821f-94e314dd5621\") " Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.865378 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle\") pod \"a0751c34-68ec-4fd1-821f-94e314dd5621\" (UID: 
\"a0751c34-68ec-4fd1-821f-94e314dd5621\") " Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.866485 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle" (OuterVolumeSpecName: "bundle") pod "a0751c34-68ec-4fd1-821f-94e314dd5621" (UID: "a0751c34-68ec-4fd1-821f-94e314dd5621"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.875858 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2" (OuterVolumeSpecName: "kube-api-access-7hcm2") pod "a0751c34-68ec-4fd1-821f-94e314dd5621" (UID: "a0751c34-68ec-4fd1-821f-94e314dd5621"). InnerVolumeSpecName "kube-api-access-7hcm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.876611 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util" (OuterVolumeSpecName: "util") pod "a0751c34-68ec-4fd1-821f-94e314dd5621" (UID: "a0751c34-68ec-4fd1-821f-94e314dd5621"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.966516 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.966553 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0751c34-68ec-4fd1-821f-94e314dd5621-util\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:15 crc kubenswrapper[4699]: I0226 11:25:15.966562 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hcm2\" (UniqueName: \"kubernetes.io/projected/a0751c34-68ec-4fd1-821f-94e314dd5621-kube-api-access-7hcm2\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:16 crc kubenswrapper[4699]: I0226 11:25:16.386361 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" event={"ID":"a0751c34-68ec-4fd1-821f-94e314dd5621","Type":"ContainerDied","Data":"8e60f093fadd5b8f255fd685567fcc1049fd5ee2845fd36a5cbe4b8a83c5a17f"} Feb 26 11:25:16 crc kubenswrapper[4699]: I0226 11:25:16.386401 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e60f093fadd5b8f255fd685567fcc1049fd5ee2845fd36a5cbe4b8a83c5a17f" Feb 26 11:25:16 crc kubenswrapper[4699]: I0226 11:25:16.386459 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.735564 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8"] Feb 26 11:25:17 crc kubenswrapper[4699]: E0226 11:25:17.736073 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="extract" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.736092 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="extract" Feb 26 11:25:17 crc kubenswrapper[4699]: E0226 11:25:17.736135 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="util" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.736145 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="util" Feb 26 11:25:17 crc kubenswrapper[4699]: E0226 11:25:17.736159 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="pull" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.736176 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="pull" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.736286 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0751c34-68ec-4fd1-821f-94e314dd5621" containerName="extract" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.736638 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.740900 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.741274 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9krck" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.741435 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.745804 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8"] Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.801272 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ff56\" (UniqueName: \"kubernetes.io/projected/15312afe-49aa-4681-8513-6ed9c774d222-kube-api-access-8ff56\") pod \"nmstate-operator-75c5dccd6c-8l8n8\" (UID: \"15312afe-49aa-4681-8513-6ed9c774d222\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.902744 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ff56\" (UniqueName: \"kubernetes.io/projected/15312afe-49aa-4681-8513-6ed9c774d222-kube-api-access-8ff56\") pod \"nmstate-operator-75c5dccd6c-8l8n8\" (UID: \"15312afe-49aa-4681-8513-6ed9c774d222\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" Feb 26 11:25:17 crc kubenswrapper[4699]: I0226 11:25:17.920843 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ff56\" (UniqueName: \"kubernetes.io/projected/15312afe-49aa-4681-8513-6ed9c774d222-kube-api-access-8ff56\") pod \"nmstate-operator-75c5dccd6c-8l8n8\" (UID: 
\"15312afe-49aa-4681-8513-6ed9c774d222\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" Feb 26 11:25:18 crc kubenswrapper[4699]: I0226 11:25:18.057330 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" Feb 26 11:25:22 crc kubenswrapper[4699]: I0226 11:25:22.575399 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8"] Feb 26 11:25:23 crc kubenswrapper[4699]: I0226 11:25:23.454428 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" event={"ID":"15312afe-49aa-4681-8513-6ed9c774d222","Type":"ContainerStarted","Data":"437400b19e39ad13f8a5fba459599183e02859395f12ff0d707908885ae8c8bd"} Feb 26 11:25:23 crc kubenswrapper[4699]: I0226 11:25:23.457091 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqmqz" event={"ID":"a69df934-4fa7-472d-abe7-8fa4ec5d4296","Type":"ContainerStarted","Data":"82bb3a908b13a6664db9f5fd16f23580aaf3e16ecceb5cb2a3e2885f89be6580"} Feb 26 11:25:27 crc kubenswrapper[4699]: I0226 11:25:27.685503 4699 generic.go:334] "Generic (PLEG): container finished" podID="a69df934-4fa7-472d-abe7-8fa4ec5d4296" containerID="82bb3a908b13a6664db9f5fd16f23580aaf3e16ecceb5cb2a3e2885f89be6580" exitCode=0 Feb 26 11:25:27 crc kubenswrapper[4699]: I0226 11:25:27.685611 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqmqz" event={"ID":"a69df934-4fa7-472d-abe7-8fa4ec5d4296","Type":"ContainerDied","Data":"82bb3a908b13a6664db9f5fd16f23580aaf3e16ecceb5cb2a3e2885f89be6580"} Feb 26 11:25:32 crc kubenswrapper[4699]: I0226 11:25:32.719794 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqmqz" 
event={"ID":"a69df934-4fa7-472d-abe7-8fa4ec5d4296","Type":"ContainerStarted","Data":"0cbe859cad3566719b89bc1b0cbfabbe4d6c0bd549cbc5a1536325d87ef1795f"} Feb 26 11:25:32 crc kubenswrapper[4699]: I0226 11:25:32.739314 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wqmqz" podStartSLOduration=2.936186202 podStartE2EDuration="23.73929663s" podCreationTimestamp="2026-02-26 11:25:09 +0000 UTC" firstStartedPulling="2026-02-26 11:25:11.343104725 +0000 UTC m=+857.153931159" lastFinishedPulling="2026-02-26 11:25:32.146215153 +0000 UTC m=+877.957041587" observedRunningTime="2026-02-26 11:25:32.736431237 +0000 UTC m=+878.547257691" watchObservedRunningTime="2026-02-26 11:25:32.73929663 +0000 UTC m=+878.550123064" Feb 26 11:25:33 crc kubenswrapper[4699]: I0226 11:25:33.726845 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" event={"ID":"15312afe-49aa-4681-8513-6ed9c774d222","Type":"ContainerStarted","Data":"56452d7d4e4d53e5e0d776eef0ec2c70219b0f7226a62ebb47fc6bdf3d76555a"} Feb 26 11:25:33 crc kubenswrapper[4699]: I0226 11:25:33.753705 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-8l8n8" podStartSLOduration=6.192539795 podStartE2EDuration="16.753688563s" podCreationTimestamp="2026-02-26 11:25:17 +0000 UTC" firstStartedPulling="2026-02-26 11:25:22.587943311 +0000 UTC m=+868.398769745" lastFinishedPulling="2026-02-26 11:25:33.149092079 +0000 UTC m=+878.959918513" observedRunningTime="2026-02-26 11:25:33.75254242 +0000 UTC m=+879.563368874" watchObservedRunningTime="2026-02-26 11:25:33.753688563 +0000 UTC m=+879.564514987" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.767456 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jnrsc"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.768803 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.773412 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-qmw66"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.774175 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.782352 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-r6rhm" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.782412 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.805886 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jnrsc"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.822233 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-qmw66"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.838296 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5jrwg"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.839286 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.908053 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d674e733-7357-43e5-be9c-4d4e9bad252c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.908127 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s95g\" (UniqueName: \"kubernetes.io/projected/c4897df9-3a79-41bf-a7ba-7a72d888f8e1-kube-api-access-4s95g\") pod \"nmstate-metrics-69594cc75-jnrsc\" (UID: \"c4897df9-3a79-41bf-a7ba-7a72d888f8e1\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.908169 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b26sz\" (UniqueName: \"kubernetes.io/projected/d674e733-7357-43e5-be9c-4d4e9bad252c-kube-api-access-b26sz\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.911189 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx"] Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.912052 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.915003 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.915267 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jlnzj" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.917069 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 26 11:25:34 crc kubenswrapper[4699]: I0226 11:25:34.921370 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx"] Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.009846 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lwl\" (UniqueName: \"kubernetes.io/projected/80de38f0-8620-4e27-988e-6d85d7c8bc24-kube-api-access-k9lwl\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.009934 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d674e733-7357-43e5-be9c-4d4e9bad252c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.009969 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s95g\" (UniqueName: \"kubernetes.io/projected/c4897df9-3a79-41bf-a7ba-7a72d888f8e1-kube-api-access-4s95g\") pod \"nmstate-metrics-69594cc75-jnrsc\" (UID: \"c4897df9-3a79-41bf-a7ba-7a72d888f8e1\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.010002 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-ovs-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.010030 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b26sz\" (UniqueName: \"kubernetes.io/projected/d674e733-7357-43e5-be9c-4d4e9bad252c-kube-api-access-b26sz\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.010054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-nmstate-lock\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.010083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-dbus-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.021743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d674e733-7357-43e5-be9c-4d4e9bad252c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.026793 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s95g\" (UniqueName: \"kubernetes.io/projected/c4897df9-3a79-41bf-a7ba-7a72d888f8e1-kube-api-access-4s95g\") pod \"nmstate-metrics-69594cc75-jnrsc\" (UID: \"c4897df9-3a79-41bf-a7ba-7a72d888f8e1\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.040106 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b26sz\" (UniqueName: \"kubernetes.io/projected/d674e733-7357-43e5-be9c-4d4e9bad252c-kube-api-access-b26sz\") pod \"nmstate-webhook-786f45cff4-qmw66\" (UID: \"d674e733-7357-43e5-be9c-4d4e9bad252c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.103517 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.110939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/13fc1aa0-a043-4b42-952b-7f718ff577d2-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.110986 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-ovs-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111018 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-nmstate-lock\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111313 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rgn\" (UniqueName: \"kubernetes.io/projected/13fc1aa0-a043-4b42-952b-7f718ff577d2-kube-api-access-f4rgn\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111320 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-ovs-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-dbus-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/13fc1aa0-a043-4b42-952b-7f718ff577d2-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111534 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-dbus-socket\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111540 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/80de38f0-8620-4e27-988e-6d85d7c8bc24-nmstate-lock\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.111610 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lwl\" (UniqueName: \"kubernetes.io/projected/80de38f0-8620-4e27-988e-6d85d7c8bc24-kube-api-access-k9lwl\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.115885 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.123284 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-799ddfb64f-wf4l2"] Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.124229 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.143440 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799ddfb64f-wf4l2"] Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.177794 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lwl\" (UniqueName: \"kubernetes.io/projected/80de38f0-8620-4e27-988e-6d85d7c8bc24-kube-api-access-k9lwl\") pod \"nmstate-handler-5jrwg\" (UID: \"80de38f0-8620-4e27-988e-6d85d7c8bc24\") " pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213313 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-oauth-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213390 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/13fc1aa0-a043-4b42-952b-7f718ff577d2-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213419 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmh94\" (UniqueName: \"kubernetes.io/projected/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-kube-api-access-vmh94\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213463 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-oauth-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213488 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4rgn\" (UniqueName: \"kubernetes.io/projected/13fc1aa0-a043-4b42-952b-7f718ff577d2-kube-api-access-f4rgn\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-trusted-ca-bundle\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213547 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-service-ca\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213571 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc 
kubenswrapper[4699]: I0226 11:25:35.213610 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.213635 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/13fc1aa0-a043-4b42-952b-7f718ff577d2-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.214483 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/13fc1aa0-a043-4b42-952b-7f718ff577d2-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.216639 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/13fc1aa0-a043-4b42-952b-7f718ff577d2-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.231275 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4rgn\" (UniqueName: \"kubernetes.io/projected/13fc1aa0-a043-4b42-952b-7f718ff577d2-kube-api-access-f4rgn\") pod \"nmstate-console-plugin-5dcbbd79cf-7f4bx\" (UID: \"13fc1aa0-a043-4b42-952b-7f718ff577d2\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.314505 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-oauth-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.314590 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmh94\" (UniqueName: \"kubernetes.io/projected/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-kube-api-access-vmh94\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.314616 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-oauth-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.318202 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-oauth-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.318282 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-trusted-ca-bundle\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " 
pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.318319 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-service-ca\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.318351 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.318375 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.320983 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.321637 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-oauth-serving-cert\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 
26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.335616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-service-ca\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.336196 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-console-config\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.338018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmh94\" (UniqueName: \"kubernetes.io/projected/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-kube-api-access-vmh94\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.369696 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f64b56-52a9-4f58-a20b-04c94c94fb9d-trusted-ca-bundle\") pod \"console-799ddfb64f-wf4l2\" (UID: \"e1f64b56-52a9-4f58-a20b-04c94c94fb9d\") " pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.457237 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:35 crc kubenswrapper[4699]: W0226 11:25:35.484595 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80de38f0_8620_4e27_988e_6d85d7c8bc24.slice/crio-cf3c458fa688f071c79305c0c35e01ea234329467585d8accc935ecd72622e2b WatchSource:0}: Error finding container cf3c458fa688f071c79305c0c35e01ea234329467585d8accc935ecd72622e2b: Status 404 returned error can't find the container with id cf3c458fa688f071c79305c0c35e01ea234329467585d8accc935ecd72622e2b Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.494928 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.527828 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.539293 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jnrsc"] Feb 26 11:25:35 crc kubenswrapper[4699]: W0226 11:25:35.565277 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4897df9_3a79_41bf_a7ba_7a72d888f8e1.slice/crio-af4684c7fb3b349d3e6c643049f0fbcb46851639637ba37a76dec29f1efb0a24 WatchSource:0}: Error finding container af4684c7fb3b349d3e6c643049f0fbcb46851639637ba37a76dec29f1efb0a24: Status 404 returned error can't find the container with id af4684c7fb3b349d3e6c643049f0fbcb46851639637ba37a76dec29f1efb0a24 Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.739585 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5jrwg" 
event={"ID":"80de38f0-8620-4e27-988e-6d85d7c8bc24","Type":"ContainerStarted","Data":"cf3c458fa688f071c79305c0c35e01ea234329467585d8accc935ecd72622e2b"} Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.741313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" event={"ID":"c4897df9-3a79-41bf-a7ba-7a72d888f8e1","Type":"ContainerStarted","Data":"af4684c7fb3b349d3e6c643049f0fbcb46851639637ba37a76dec29f1efb0a24"} Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.802529 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-qmw66"] Feb 26 11:25:35 crc kubenswrapper[4699]: W0226 11:25:35.806620 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd674e733_7357_43e5_be9c_4d4e9bad252c.slice/crio-3f421296c721c33476eb00fa702942c3481d0aebc3247b470d9edcb3c9bc06b0 WatchSource:0}: Error finding container 3f421296c721c33476eb00fa702942c3481d0aebc3247b470d9edcb3c9bc06b0: Status 404 returned error can't find the container with id 3f421296c721c33476eb00fa702942c3481d0aebc3247b470d9edcb3c9bc06b0 Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.939651 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799ddfb64f-wf4l2"] Feb 26 11:25:35 crc kubenswrapper[4699]: W0226 11:25:35.941499 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f64b56_52a9_4f58_a20b_04c94c94fb9d.slice/crio-86ffd65f29058f41e7d42a7289882fb46913e5f44d0210f4d4320d8b53187a2d WatchSource:0}: Error finding container 86ffd65f29058f41e7d42a7289882fb46913e5f44d0210f4d4320d8b53187a2d: Status 404 returned error can't find the container with id 86ffd65f29058f41e7d42a7289882fb46913e5f44d0210f4d4320d8b53187a2d Feb 26 11:25:35 crc kubenswrapper[4699]: I0226 11:25:35.967092 4699 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx"] Feb 26 11:25:35 crc kubenswrapper[4699]: W0226 11:25:35.972512 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fc1aa0_a043_4b42_952b_7f718ff577d2.slice/crio-d60d16c47edb3972def6d97df802c1b1a86c6dbe8cfce544e1fdaf762d875d54 WatchSource:0}: Error finding container d60d16c47edb3972def6d97df802c1b1a86c6dbe8cfce544e1fdaf762d875d54: Status 404 returned error can't find the container with id d60d16c47edb3972def6d97df802c1b1a86c6dbe8cfce544e1fdaf762d875d54 Feb 26 11:25:36 crc kubenswrapper[4699]: I0226 11:25:36.747919 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799ddfb64f-wf4l2" event={"ID":"e1f64b56-52a9-4f58-a20b-04c94c94fb9d","Type":"ContainerStarted","Data":"65521525e894a180389887166ec1a9561b3180c80b9d12275598c55fbd6ce6cc"} Feb 26 11:25:36 crc kubenswrapper[4699]: I0226 11:25:36.748321 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799ddfb64f-wf4l2" event={"ID":"e1f64b56-52a9-4f58-a20b-04c94c94fb9d","Type":"ContainerStarted","Data":"86ffd65f29058f41e7d42a7289882fb46913e5f44d0210f4d4320d8b53187a2d"} Feb 26 11:25:36 crc kubenswrapper[4699]: I0226 11:25:36.749883 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" event={"ID":"d674e733-7357-43e5-be9c-4d4e9bad252c","Type":"ContainerStarted","Data":"3f421296c721c33476eb00fa702942c3481d0aebc3247b470d9edcb3c9bc06b0"} Feb 26 11:25:36 crc kubenswrapper[4699]: I0226 11:25:36.750753 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" event={"ID":"13fc1aa0-a043-4b42-952b-7f718ff577d2","Type":"ContainerStarted","Data":"d60d16c47edb3972def6d97df802c1b1a86c6dbe8cfce544e1fdaf762d875d54"} Feb 26 11:25:36 crc kubenswrapper[4699]: I0226 11:25:36.771748 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-799ddfb64f-wf4l2" podStartSLOduration=1.771730948 podStartE2EDuration="1.771730948s" podCreationTimestamp="2026-02-26 11:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:25:36.767579347 +0000 UTC m=+882.578405801" watchObservedRunningTime="2026-02-26 11:25:36.771730948 +0000 UTC m=+882.582557382" Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.772013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" event={"ID":"13fc1aa0-a043-4b42-952b-7f718ff577d2","Type":"ContainerStarted","Data":"0e03e537a0b0e5d937c5a32b928e5e0bfc6bd5e36979095c545552e26e58e356"} Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.774457 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" event={"ID":"d674e733-7357-43e5-be9c-4d4e9bad252c","Type":"ContainerStarted","Data":"63c66c4576ed4461bd7fbc121a3ae4f04ecb97cee007e9d72a613167741796d6"} Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.775237 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.777384 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" event={"ID":"c4897df9-3a79-41bf-a7ba-7a72d888f8e1","Type":"ContainerStarted","Data":"262c78d22a675e5b5e6f9df75c304d1e26ca3bde89ecdbd804c743e2b4234713"} Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.779101 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5jrwg" event={"ID":"80de38f0-8620-4e27-988e-6d85d7c8bc24","Type":"ContainerStarted","Data":"3b299fc65625c25cc65e2a7b038bb4412403edc1687c45536ba1818e4a3bdeaf"} Feb 
26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.779261 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.792505 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-7f4bx" podStartSLOduration=2.631660578 podStartE2EDuration="5.792487793s" podCreationTimestamp="2026-02-26 11:25:34 +0000 UTC" firstStartedPulling="2026-02-26 11:25:35.974875724 +0000 UTC m=+881.785702158" lastFinishedPulling="2026-02-26 11:25:39.135702939 +0000 UTC m=+884.946529373" observedRunningTime="2026-02-26 11:25:39.787298971 +0000 UTC m=+885.598125426" watchObservedRunningTime="2026-02-26 11:25:39.792487793 +0000 UTC m=+885.603314227" Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.808915 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5jrwg" podStartSLOduration=2.150407497 podStartE2EDuration="5.808894951s" podCreationTimestamp="2026-02-26 11:25:34 +0000 UTC" firstStartedPulling="2026-02-26 11:25:35.486859536 +0000 UTC m=+881.297685970" lastFinishedPulling="2026-02-26 11:25:39.14534699 +0000 UTC m=+884.956173424" observedRunningTime="2026-02-26 11:25:39.80543103 +0000 UTC m=+885.616257464" watchObservedRunningTime="2026-02-26 11:25:39.808894951 +0000 UTC m=+885.619721395" Feb 26 11:25:39 crc kubenswrapper[4699]: I0226 11:25:39.827941 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" podStartSLOduration=2.489405032 podStartE2EDuration="5.827921625s" podCreationTimestamp="2026-02-26 11:25:34 +0000 UTC" firstStartedPulling="2026-02-26 11:25:35.809879976 +0000 UTC m=+881.620706410" lastFinishedPulling="2026-02-26 11:25:39.148396559 +0000 UTC m=+884.959223003" observedRunningTime="2026-02-26 11:25:39.82294612 +0000 UTC m=+885.633772574" 
watchObservedRunningTime="2026-02-26 11:25:39.827921625 +0000 UTC m=+885.638748059" Feb 26 11:25:40 crc kubenswrapper[4699]: I0226 11:25:40.133060 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:40 crc kubenswrapper[4699]: I0226 11:25:40.133464 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:40 crc kubenswrapper[4699]: I0226 11:25:40.168225 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:40 crc kubenswrapper[4699]: I0226 11:25:40.821592 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wqmqz" Feb 26 11:25:40 crc kubenswrapper[4699]: I0226 11:25:40.885825 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqmqz"] Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.018326 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv8lg"] Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.018549 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xv8lg" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="registry-server" containerID="cri-o://e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9" gracePeriod=2 Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.392534 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.531131 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content\") pod \"e84a1dbc-431c-4897-b5fd-f04460b7f943\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.531199 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities\") pod \"e84a1dbc-431c-4897-b5fd-f04460b7f943\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.531219 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwr8w\" (UniqueName: \"kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w\") pod \"e84a1dbc-431c-4897-b5fd-f04460b7f943\" (UID: \"e84a1dbc-431c-4897-b5fd-f04460b7f943\") " Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.532034 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities" (OuterVolumeSpecName: "utilities") pod "e84a1dbc-431c-4897-b5fd-f04460b7f943" (UID: "e84a1dbc-431c-4897-b5fd-f04460b7f943"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.537333 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w" (OuterVolumeSpecName: "kube-api-access-jwr8w") pod "e84a1dbc-431c-4897-b5fd-f04460b7f943" (UID: "e84a1dbc-431c-4897-b5fd-f04460b7f943"). InnerVolumeSpecName "kube-api-access-jwr8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.633195 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.633231 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwr8w\" (UniqueName: \"kubernetes.io/projected/e84a1dbc-431c-4897-b5fd-f04460b7f943-kube-api-access-jwr8w\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.690451 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e84a1dbc-431c-4897-b5fd-f04460b7f943" (UID: "e84a1dbc-431c-4897-b5fd-f04460b7f943"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.734047 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84a1dbc-431c-4897-b5fd-f04460b7f943-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.793786 4699 generic.go:334] "Generic (PLEG): container finished" podID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerID="e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9" exitCode=0 Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.793898 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerDied","Data":"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9"} Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.793925 4699 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xv8lg" event={"ID":"e84a1dbc-431c-4897-b5fd-f04460b7f943","Type":"ContainerDied","Data":"6d8c08def942c9655caee92b122902fc51271c1537ca60f7447fb09b383d1bcf"} Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.793941 4699 scope.go:117] "RemoveContainer" containerID="e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.794339 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xv8lg" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.826834 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv8lg"] Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.826892 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xv8lg"] Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.826999 4699 scope.go:117] "RemoveContainer" containerID="b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4" Feb 26 11:25:41 crc kubenswrapper[4699]: I0226 11:25:41.863736 4699 scope.go:117] "RemoveContainer" containerID="388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.268709 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" path="/var/lib/kubelet/pods/e84a1dbc-431c-4897-b5fd-f04460b7f943/volumes" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.539135 4699 scope.go:117] "RemoveContainer" containerID="e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9" Feb 26 11:25:42 crc kubenswrapper[4699]: E0226 11:25:42.539673 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9\": container with ID starting with 
e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9 not found: ID does not exist" containerID="e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.539706 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9"} err="failed to get container status \"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9\": rpc error: code = NotFound desc = could not find container \"e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9\": container with ID starting with e566d9d06139e29d621cf7bc7b00ad22f80105f4a94a4e3e816956084476cfc9 not found: ID does not exist" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.539726 4699 scope.go:117] "RemoveContainer" containerID="b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4" Feb 26 11:25:42 crc kubenswrapper[4699]: E0226 11:25:42.539947 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4\": container with ID starting with b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4 not found: ID does not exist" containerID="b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.539970 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4"} err="failed to get container status \"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4\": rpc error: code = NotFound desc = could not find container \"b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4\": container with ID starting with b4bfd779c465bd24858dde6afd4dca28e4ac0e16d185e7df709e6584c822fee4 not found: ID does not 
exist" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.539983 4699 scope.go:117] "RemoveContainer" containerID="388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5" Feb 26 11:25:42 crc kubenswrapper[4699]: E0226 11:25:42.540203 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5\": container with ID starting with 388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5 not found: ID does not exist" containerID="388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5" Feb 26 11:25:42 crc kubenswrapper[4699]: I0226 11:25:42.540222 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5"} err="failed to get container status \"388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5\": rpc error: code = NotFound desc = could not find container \"388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5\": container with ID starting with 388e074c3a65bb78e59f3313093bb2c909eaec385eec81e289a50a066aaab9b5 not found: ID does not exist" Feb 26 11:25:43 crc kubenswrapper[4699]: I0226 11:25:43.808127 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" event={"ID":"c4897df9-3a79-41bf-a7ba-7a72d888f8e1","Type":"ContainerStarted","Data":"827f63efb0e8e180c7cb29b8b50b93fc12b981356642e7812eb67717b5870aee"} Feb 26 11:25:43 crc kubenswrapper[4699]: I0226 11:25:43.829457 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-jnrsc" podStartSLOduration=2.5427655160000002 podStartE2EDuration="9.829442422s" podCreationTimestamp="2026-02-26 11:25:34 +0000 UTC" firstStartedPulling="2026-02-26 11:25:35.571481491 +0000 UTC m=+881.382307915" lastFinishedPulling="2026-02-26 
11:25:42.858158387 +0000 UTC m=+888.668984821" observedRunningTime="2026-02-26 11:25:43.824966642 +0000 UTC m=+889.635793096" watchObservedRunningTime="2026-02-26 11:25:43.829442422 +0000 UTC m=+889.640268856" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.481421 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5jrwg" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.497184 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.497229 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.504617 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.829629 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-799ddfb64f-wf4l2" Feb 26 11:25:45 crc kubenswrapper[4699]: I0226 11:25:45.885415 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"] Feb 26 11:25:55 crc kubenswrapper[4699]: I0226 11:25:55.122156 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-qmw66" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.131890 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535086-jjp9j"] Feb 26 11:26:00 crc kubenswrapper[4699]: E0226 11:26:00.133617 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="registry-server" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.133715 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="registry-server" Feb 26 11:26:00 crc kubenswrapper[4699]: E0226 11:26:00.133785 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="extract-utilities" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.133925 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="extract-utilities" Feb 26 11:26:00 crc kubenswrapper[4699]: E0226 11:26:00.133980 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="extract-content" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.134031 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="extract-content" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.134196 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84a1dbc-431c-4897-b5fd-f04460b7f943" containerName="registry-server" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.134623 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.137394 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.137611 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.137662 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.142205 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535086-jjp9j"] Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.185334 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvxf\" (UniqueName: \"kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf\") pod \"auto-csr-approver-29535086-jjp9j\" (UID: \"fce3efa9-6f6f-4e81-a7a4-6249237a0d61\") " pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.286985 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvxf\" (UniqueName: \"kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf\") pod \"auto-csr-approver-29535086-jjp9j\" (UID: \"fce3efa9-6f6f-4e81-a7a4-6249237a0d61\") " pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.305887 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvxf\" (UniqueName: \"kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf\") pod \"auto-csr-approver-29535086-jjp9j\" (UID: \"fce3efa9-6f6f-4e81-a7a4-6249237a0d61\") " 
pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.472469 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:00 crc kubenswrapper[4699]: I0226 11:26:00.861341 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535086-jjp9j"] Feb 26 11:26:01 crc kubenswrapper[4699]: I0226 11:26:01.919928 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" event={"ID":"fce3efa9-6f6f-4e81-a7a4-6249237a0d61","Type":"ContainerStarted","Data":"6a8f188e1e1d79ac2bc4764d8f75fb2c0ee626e5c9dfacdd5ec8bb51719ceaa4"} Feb 26 11:26:02 crc kubenswrapper[4699]: I0226 11:26:02.929957 4699 generic.go:334] "Generic (PLEG): container finished" podID="fce3efa9-6f6f-4e81-a7a4-6249237a0d61" containerID="842f6cf352666ae13feda0b772e0ee74a200121a74a35bd2b4b96deac77bd6aa" exitCode=0 Feb 26 11:26:02 crc kubenswrapper[4699]: I0226 11:26:02.930056 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" event={"ID":"fce3efa9-6f6f-4e81-a7a4-6249237a0d61","Type":"ContainerDied","Data":"842f6cf352666ae13feda0b772e0ee74a200121a74a35bd2b4b96deac77bd6aa"} Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.203800 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.266834 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvxf\" (UniqueName: \"kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf\") pod \"fce3efa9-6f6f-4e81-a7a4-6249237a0d61\" (UID: \"fce3efa9-6f6f-4e81-a7a4-6249237a0d61\") " Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.278716 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf" (OuterVolumeSpecName: "kube-api-access-tnvxf") pod "fce3efa9-6f6f-4e81-a7a4-6249237a0d61" (UID: "fce3efa9-6f6f-4e81-a7a4-6249237a0d61"). InnerVolumeSpecName "kube-api-access-tnvxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.371228 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvxf\" (UniqueName: \"kubernetes.io/projected/fce3efa9-6f6f-4e81-a7a4-6249237a0d61-kube-api-access-tnvxf\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.945293 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" event={"ID":"fce3efa9-6f6f-4e81-a7a4-6249237a0d61","Type":"ContainerDied","Data":"6a8f188e1e1d79ac2bc4764d8f75fb2c0ee626e5c9dfacdd5ec8bb51719ceaa4"} Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.945668 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a8f188e1e1d79ac2bc4764d8f75fb2c0ee626e5c9dfacdd5ec8bb51719ceaa4" Feb 26 11:26:04 crc kubenswrapper[4699]: I0226 11:26:04.945388 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535086-jjp9j" Feb 26 11:26:05 crc kubenswrapper[4699]: I0226 11:26:05.252179 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535080-dcs8z"] Feb 26 11:26:05 crc kubenswrapper[4699]: I0226 11:26:05.256096 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535080-dcs8z"] Feb 26 11:26:06 crc kubenswrapper[4699]: I0226 11:26:06.268893 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ea4516-0708-4b4a-9dd5-75e6220a55d4" path="/var/lib/kubelet/pods/c9ea4516-0708-4b4a-9dd5-75e6220a55d4/volumes" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.521659 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb"] Feb 26 11:26:08 crc kubenswrapper[4699]: E0226 11:26:08.522262 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce3efa9-6f6f-4e81-a7a4-6249237a0d61" containerName="oc" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.522279 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce3efa9-6f6f-4e81-a7a4-6249237a0d61" containerName="oc" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.522401 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce3efa9-6f6f-4e81-a7a4-6249237a0d61" containerName="oc" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.523332 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.525739 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.535777 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb"] Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.625098 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.625190 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zvr\" (UniqueName: \"kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.625212 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: 
I0226 11:26:08.727009 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.727099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zvr\" (UniqueName: \"kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.727146 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.727661 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.727869 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.745985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zvr\" (UniqueName: \"kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:08 crc kubenswrapper[4699]: I0226 11:26:08.847995 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:09 crc kubenswrapper[4699]: I0226 11:26:09.030585 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb"] Feb 26 11:26:09 crc kubenswrapper[4699]: I0226 11:26:09.977464 4699 generic.go:334] "Generic (PLEG): container finished" podID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerID="9a479ac992935636809c4c304863e5e93d2ad4ebac7734e672373d337cb9fb85" exitCode=0 Feb 26 11:26:09 crc kubenswrapper[4699]: I0226 11:26:09.977510 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" event={"ID":"2628fd13-0f89-4bb3-9b76-86a9331a303e","Type":"ContainerDied","Data":"9a479ac992935636809c4c304863e5e93d2ad4ebac7734e672373d337cb9fb85"} Feb 26 11:26:09 crc kubenswrapper[4699]: I0226 11:26:09.977768 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" event={"ID":"2628fd13-0f89-4bb3-9b76-86a9331a303e","Type":"ContainerStarted","Data":"16696c6f79ef696b7807a897e6edabd91a65b5b5a7df8c6396ccce83c630f467"} Feb 26 11:26:10 crc kubenswrapper[4699]: I0226 11:26:10.931515 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hnsh7" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" containerID="cri-o://53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f" gracePeriod=15 Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.248281 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hnsh7_e6bdcf19-db76-497c-a2fe-a6de38fae724/console/0.log" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.248608 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427623 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427710 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427771 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427797 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427903 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvvdh\" (UniqueName: \"kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.427928 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert\") pod \"e6bdcf19-db76-497c-a2fe-a6de38fae724\" (UID: \"e6bdcf19-db76-497c-a2fe-a6de38fae724\") " Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.429740 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca" (OuterVolumeSpecName: "service-ca") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.429869 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config" (OuterVolumeSpecName: "console-config") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.430139 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.430176 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.437438 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh" (OuterVolumeSpecName: "kube-api-access-wvvdh") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "kube-api-access-wvvdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.438406 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.439210 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e6bdcf19-db76-497c-a2fe-a6de38fae724" (UID: "e6bdcf19-db76-497c-a2fe-a6de38fae724"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530033 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvvdh\" (UniqueName: \"kubernetes.io/projected/e6bdcf19-db76-497c-a2fe-a6de38fae724-kube-api-access-wvvdh\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530086 4699 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530107 4699 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530128 4699 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530174 4699 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530190 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.530206 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bdcf19-db76-497c-a2fe-a6de38fae724-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989437 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hnsh7_e6bdcf19-db76-497c-a2fe-a6de38fae724/console/0.log" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989481 4699 generic.go:334] "Generic (PLEG): container finished" podID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerID="53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f" exitCode=2 Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989512 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hnsh7" event={"ID":"e6bdcf19-db76-497c-a2fe-a6de38fae724","Type":"ContainerDied","Data":"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f"} Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989538 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hnsh7" 
event={"ID":"e6bdcf19-db76-497c-a2fe-a6de38fae724","Type":"ContainerDied","Data":"70e987324485f04a528051e1c4554d8c5806c907f67af5218c0970ab13cf9e3b"} Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989553 4699 scope.go:117] "RemoveContainer" containerID="53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f" Feb 26 11:26:11 crc kubenswrapper[4699]: I0226 11:26:11.989681 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hnsh7" Feb 26 11:26:12 crc kubenswrapper[4699]: I0226 11:26:12.034149 4699 scope.go:117] "RemoveContainer" containerID="53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f" Feb 26 11:26:12 crc kubenswrapper[4699]: E0226 11:26:12.034539 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f\": container with ID starting with 53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f not found: ID does not exist" containerID="53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f" Feb 26 11:26:12 crc kubenswrapper[4699]: I0226 11:26:12.034591 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f"} err="failed to get container status \"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f\": rpc error: code = NotFound desc = could not find container \"53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f\": container with ID starting with 53695fe3f72ab23b0bc81ef3a7cecb32887acb430df553db59397c8ca54a398f not found: ID does not exist" Feb 26 11:26:12 crc kubenswrapper[4699]: I0226 11:26:12.036646 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"] Feb 26 11:26:12 crc kubenswrapper[4699]: I0226 11:26:12.040596 4699 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hnsh7"] Feb 26 11:26:12 crc kubenswrapper[4699]: I0226 11:26:12.280748 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" path="/var/lib/kubelet/pods/e6bdcf19-db76-497c-a2fe-a6de38fae724/volumes" Feb 26 11:26:13 crc kubenswrapper[4699]: I0226 11:26:13.000450 4699 generic.go:334] "Generic (PLEG): container finished" podID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerID="39f20a8262954809bb749b951e7954c51b3b01f83d6d9b533079491f91de7f81" exitCode=0 Feb 26 11:26:13 crc kubenswrapper[4699]: I0226 11:26:13.000543 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" event={"ID":"2628fd13-0f89-4bb3-9b76-86a9331a303e","Type":"ContainerDied","Data":"39f20a8262954809bb749b951e7954c51b3b01f83d6d9b533079491f91de7f81"} Feb 26 11:26:14 crc kubenswrapper[4699]: I0226 11:26:14.009508 4699 generic.go:334] "Generic (PLEG): container finished" podID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerID="071c5a4ed9d914dc4937b803376a521876de0e1bc41ff9eef3bdef617821dfbe" exitCode=0 Feb 26 11:26:14 crc kubenswrapper[4699]: I0226 11:26:14.009736 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" event={"ID":"2628fd13-0f89-4bb3-9b76-86a9331a303e","Type":"ContainerDied","Data":"071c5a4ed9d914dc4937b803376a521876de0e1bc41ff9eef3bdef617821dfbe"} Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.266428 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.377643 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util\") pod \"2628fd13-0f89-4bb3-9b76-86a9331a303e\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.377932 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle\") pod \"2628fd13-0f89-4bb3-9b76-86a9331a303e\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.378028 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6zvr\" (UniqueName: \"kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr\") pod \"2628fd13-0f89-4bb3-9b76-86a9331a303e\" (UID: \"2628fd13-0f89-4bb3-9b76-86a9331a303e\") " Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.380209 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle" (OuterVolumeSpecName: "bundle") pod "2628fd13-0f89-4bb3-9b76-86a9331a303e" (UID: "2628fd13-0f89-4bb3-9b76-86a9331a303e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.384427 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr" (OuterVolumeSpecName: "kube-api-access-b6zvr") pod "2628fd13-0f89-4bb3-9b76-86a9331a303e" (UID: "2628fd13-0f89-4bb3-9b76-86a9331a303e"). InnerVolumeSpecName "kube-api-access-b6zvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.393666 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util" (OuterVolumeSpecName: "util") pod "2628fd13-0f89-4bb3-9b76-86a9331a303e" (UID: "2628fd13-0f89-4bb3-9b76-86a9331a303e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.479146 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6zvr\" (UniqueName: \"kubernetes.io/projected/2628fd13-0f89-4bb3-9b76-86a9331a303e-kube-api-access-b6zvr\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.479183 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-util\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:15 crc kubenswrapper[4699]: I0226 11:26:15.479194 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2628fd13-0f89-4bb3-9b76-86a9331a303e-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:26:16 crc kubenswrapper[4699]: I0226 11:26:16.025543 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" event={"ID":"2628fd13-0f89-4bb3-9b76-86a9331a303e","Type":"ContainerDied","Data":"16696c6f79ef696b7807a897e6edabd91a65b5b5a7df8c6396ccce83c630f467"} Feb 26 11:26:16 crc kubenswrapper[4699]: I0226 11:26:16.025601 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16696c6f79ef696b7807a897e6edabd91a65b5b5a7df8c6396ccce83c630f467" Feb 26 11:26:16 crc kubenswrapper[4699]: I0226 11:26:16.025740 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb" Feb 26 11:26:21 crc kubenswrapper[4699]: I0226 11:26:21.013932 4699 scope.go:117] "RemoveContainer" containerID="ec18e4fa3c26a9a3b620eb9c167811e69c8b0db26c298c317aa409e857f17f0c" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.358356 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b"] Feb 26 11:26:25 crc kubenswrapper[4699]: E0226 11:26:25.359105 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="pull" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359145 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="pull" Feb 26 11:26:25 crc kubenswrapper[4699]: E0226 11:26:25.359164 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="extract" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359172 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="extract" Feb 26 11:26:25 crc kubenswrapper[4699]: E0226 11:26:25.359184 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359192 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" Feb 26 11:26:25 crc kubenswrapper[4699]: E0226 11:26:25.359204 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="util" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359209 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="util" 
Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359321 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bdcf19-db76-497c-a2fe-a6de38fae724" containerName="console" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359338 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2628fd13-0f89-4bb3-9b76-86a9331a303e" containerName="extract" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.359863 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.362325 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.362403 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.362673 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.364510 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.365649 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-x9knq" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.376996 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b"] Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.525703 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7swj\" (UniqueName: \"kubernetes.io/projected/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-kube-api-access-s7swj\") 
pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.525763 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-webhook-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.525800 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-apiservice-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.608160 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh"] Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.609067 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.614679 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.615706 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.615891 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-872t9" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.627164 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-apiservice-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.627313 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7swj\" (UniqueName: \"kubernetes.io/projected/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-kube-api-access-s7swj\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.627357 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-webhook-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc 
kubenswrapper[4699]: I0226 11:26:25.634807 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-apiservice-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.638973 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-webhook-cert\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.655490 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7swj\" (UniqueName: \"kubernetes.io/projected/cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8-kube-api-access-s7swj\") pod \"metallb-operator-controller-manager-5d58b8658b-qjr5b\" (UID: \"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8\") " pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.663223 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh"] Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.676261 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.728241 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-webhook-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.728323 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cp9w\" (UniqueName: \"kubernetes.io/projected/af2438c1-8812-4bb1-8999-66cb8d804c05-kube-api-access-6cp9w\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.728515 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-apiservice-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.829843 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cp9w\" (UniqueName: \"kubernetes.io/projected/af2438c1-8812-4bb1-8999-66cb8d804c05-kube-api-access-6cp9w\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.829901 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-apiservice-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.829949 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-webhook-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.835802 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-apiservice-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.835875 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2438c1-8812-4bb1-8999-66cb8d804c05-webhook-cert\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: \"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.861686 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cp9w\" (UniqueName: \"kubernetes.io/projected/af2438c1-8812-4bb1-8999-66cb8d804c05-kube-api-access-6cp9w\") pod \"metallb-operator-webhook-server-6d98597f89-glkjh\" (UID: 
\"af2438c1-8812-4bb1-8999-66cb8d804c05\") " pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.925576 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b"] Feb 26 11:26:25 crc kubenswrapper[4699]: I0226 11:26:25.925894 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:26 crc kubenswrapper[4699]: I0226 11:26:26.113094 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" event={"ID":"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8","Type":"ContainerStarted","Data":"126c16a07f27b3ebfb4ef4d7167dfcb10e2537d56e3a6f41235a7c03088b9d52"} Feb 26 11:26:26 crc kubenswrapper[4699]: I0226 11:26:26.355212 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh"] Feb 26 11:26:26 crc kubenswrapper[4699]: W0226 11:26:26.361531 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf2438c1_8812_4bb1_8999_66cb8d804c05.slice/crio-63c962498ad09cd2d7911c43ba1263076a71c6c3d06a5cc940bbd4d151f6d4a2 WatchSource:0}: Error finding container 63c962498ad09cd2d7911c43ba1263076a71c6c3d06a5cc940bbd4d151f6d4a2: Status 404 returned error can't find the container with id 63c962498ad09cd2d7911c43ba1263076a71c6c3d06a5cc940bbd4d151f6d4a2 Feb 26 11:26:27 crc kubenswrapper[4699]: I0226 11:26:27.118802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" event={"ID":"af2438c1-8812-4bb1-8999-66cb8d804c05","Type":"ContainerStarted","Data":"63c962498ad09cd2d7911c43ba1263076a71c6c3d06a5cc940bbd4d151f6d4a2"} Feb 26 11:26:30 crc kubenswrapper[4699]: I0226 11:26:30.140721 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" event={"ID":"cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8","Type":"ContainerStarted","Data":"5d0d11f0bc581f7731d75f17295799a975a413b459eeb4c8572e36a67d411967"} Feb 26 11:26:30 crc kubenswrapper[4699]: I0226 11:26:30.141349 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:26:30 crc kubenswrapper[4699]: I0226 11:26:30.166041 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" podStartSLOduration=1.662847131 podStartE2EDuration="5.166021485s" podCreationTimestamp="2026-02-26 11:26:25 +0000 UTC" firstStartedPulling="2026-02-26 11:26:25.945359793 +0000 UTC m=+931.756186227" lastFinishedPulling="2026-02-26 11:26:29.448534147 +0000 UTC m=+935.259360581" observedRunningTime="2026-02-26 11:26:30.162989148 +0000 UTC m=+935.973815602" watchObservedRunningTime="2026-02-26 11:26:30.166021485 +0000 UTC m=+935.976847929" Feb 26 11:26:32 crc kubenswrapper[4699]: I0226 11:26:32.156080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" event={"ID":"af2438c1-8812-4bb1-8999-66cb8d804c05","Type":"ContainerStarted","Data":"3953f87cfe3d6d7ffb9e60f6c4d444487aaef75cc6561dc999ac060b63dfc8b7"} Feb 26 11:26:32 crc kubenswrapper[4699]: I0226 11:26:32.156674 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:26:32 crc kubenswrapper[4699]: I0226 11:26:32.180421 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" podStartSLOduration=2.131936047 podStartE2EDuration="7.180399081s" podCreationTimestamp="2026-02-26 11:26:25 +0000 
UTC" firstStartedPulling="2026-02-26 11:26:26.364879028 +0000 UTC m=+932.175705462" lastFinishedPulling="2026-02-26 11:26:31.413342062 +0000 UTC m=+937.224168496" observedRunningTime="2026-02-26 11:26:32.178534978 +0000 UTC m=+937.989361412" watchObservedRunningTime="2026-02-26 11:26:32.180399081 +0000 UTC m=+937.991225525" Feb 26 11:26:45 crc kubenswrapper[4699]: I0226 11:26:45.933475 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6d98597f89-glkjh" Feb 26 11:27:05 crc kubenswrapper[4699]: I0226 11:27:05.678699 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5d58b8658b-qjr5b" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.532952 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wszs7"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.535041 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.541407 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.541472 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.541641 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n4r2c" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.573434 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.574637 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.576916 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.582050 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.651439 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-l8phj"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.652554 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.655610 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.655837 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.656024 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8vc6f" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.656184 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.675059 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-bs5nk"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.676175 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677445 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-startup\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677501 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-conf\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677529 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-reloader\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677577 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwb7f\" (UniqueName: \"kubernetes.io/projected/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-kube-api-access-jwb7f\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc 
kubenswrapper[4699]: I0226 11:27:06.677604 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.677661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-sockets\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.678806 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.690713 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-bs5nk"] Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.778946 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlnpn\" (UniqueName: \"kubernetes.io/projected/6ef6a9d7-6997-485a-a812-ded9d3a2df85-kube-api-access-jlnpn\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779013 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35357e2c-2a03-46f8-bc28-f7daad3b679d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779348 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-metrics-certs\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779392 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-sockets\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779416 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-startup\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779700 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-conf\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.779824 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-sockets\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-conf\") pod \"frr-k8s-wszs7\" (UID: 
\"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780257 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-reloader\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780301 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780351 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tgrz\" (UniqueName: \"kubernetes.io/projected/d656ca89-f955-44bb-9944-f75bf485a254-kube-api-access-8tgrz\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.780476 4699 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780557 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-reloader\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.780580 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs podName:dfa29d17-a66a-42fe-8275-1526f8fb6dc9 nodeName:}" failed. 
No retries permitted until 2026-02-26 11:27:07.280520306 +0000 UTC m=+973.091346740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs") pod "frr-k8s-wszs7" (UID: "dfa29d17-a66a-42fe-8275-1526f8fb6dc9") : secret "frr-k8s-certs-secret" not found Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780602 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwb7f\" (UniqueName: \"kubernetes.io/projected/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-kube-api-access-jwb7f\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780633 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780663 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780689 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrr6g\" (UniqueName: \"kubernetes.io/projected/35357e2c-2a03-46f8-bc28-f7daad3b679d-kube-api-access-qrr6g\") pod \"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780710 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-cert\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780742 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d656ca89-f955-44bb-9944-f75bf485a254-metallb-excludel2\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780787 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.780947 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.781009 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-frr-startup\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.799066 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwb7f\" (UniqueName: 
\"kubernetes.io/projected/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-kube-api-access-jwb7f\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.881902 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35357e2c-2a03-46f8-bc28-f7daad3b679d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882254 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-metrics-certs\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882478 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tgrz\" (UniqueName: \"kubernetes.io/projected/d656ca89-f955-44bb-9944-f75bf485a254-kube-api-access-8tgrz\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882595 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrr6g\" (UniqueName: \"kubernetes.io/projected/35357e2c-2a03-46f8-bc28-f7daad3b679d-kube-api-access-qrr6g\") pod 
\"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882843 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-cert\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.882968 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d656ca89-f955-44bb-9944-f75bf485a254-metallb-excludel2\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.883135 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.883259 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlnpn\" (UniqueName: \"kubernetes.io/projected/6ef6a9d7-6997-485a-a812-ded9d3a2df85-kube-api-access-jlnpn\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.882829 4699 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.883509 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs podName:d656ca89-f955-44bb-9944-f75bf485a254 nodeName:}" failed. No retries permitted until 2026-02-26 11:27:07.383489487 +0000 UTC m=+973.194315931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs") pod "speaker-l8phj" (UID: "d656ca89-f955-44bb-9944-f75bf485a254") : secret "speaker-certs-secret" not found Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.883259 4699 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 11:27:06 crc kubenswrapper[4699]: E0226 11:27:06.883692 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist podName:d656ca89-f955-44bb-9944-f75bf485a254 nodeName:}" failed. No retries permitted until 2026-02-26 11:27:07.383667773 +0000 UTC m=+973.194494237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist") pod "speaker-l8phj" (UID: "d656ca89-f955-44bb-9944-f75bf485a254") : secret "metallb-memberlist" not found Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.883796 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d656ca89-f955-44bb-9944-f75bf485a254-metallb-excludel2\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.887839 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.888219 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-metrics-certs\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.888235 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35357e2c-2a03-46f8-bc28-f7daad3b679d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.898712 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ef6a9d7-6997-485a-a812-ded9d3a2df85-cert\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.903813 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qrr6g\" (UniqueName: \"kubernetes.io/projected/35357e2c-2a03-46f8-bc28-f7daad3b679d-kube-api-access-qrr6g\") pod \"frr-k8s-webhook-server-7f989f654f-svsrb\" (UID: \"35357e2c-2a03-46f8-bc28-f7daad3b679d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.910391 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tgrz\" (UniqueName: \"kubernetes.io/projected/d656ca89-f955-44bb-9944-f75bf485a254-kube-api-access-8tgrz\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.912882 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlnpn\" (UniqueName: \"kubernetes.io/projected/6ef6a9d7-6997-485a-a812-ded9d3a2df85-kube-api-access-jlnpn\") pod \"controller-86ddb6bd46-bs5nk\" (UID: \"6ef6a9d7-6997-485a-a812-ded9d3a2df85\") " pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:06 crc kubenswrapper[4699]: I0226 11:27:06.996079 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.190595 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.291902 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.295981 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfa29d17-a66a-42fe-8275-1526f8fb6dc9-metrics-certs\") pod \"frr-k8s-wszs7\" (UID: \"dfa29d17-a66a-42fe-8275-1526f8fb6dc9\") " pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.377056 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb"] Feb 26 11:27:07 crc kubenswrapper[4699]: W0226 11:27:07.382547 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35357e2c_2a03_46f8_bc28_f7daad3b679d.slice/crio-36b83af52be953d7bad40bb5c6ad5f1848a66b78add84e2007e4c0e8dcbff369 WatchSource:0}: Error finding container 36b83af52be953d7bad40bb5c6ad5f1848a66b78add84e2007e4c0e8dcbff369: Status 404 returned error can't find the container with id 36b83af52be953d7bad40bb5c6ad5f1848a66b78add84e2007e4c0e8dcbff369 Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.393254 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.393342 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:07 crc kubenswrapper[4699]: E0226 11:27:07.393488 4699 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 11:27:07 crc kubenswrapper[4699]: E0226 11:27:07.393560 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist podName:d656ca89-f955-44bb-9944-f75bf485a254 nodeName:}" failed. No retries permitted until 2026-02-26 11:27:08.393543049 +0000 UTC m=+974.204369483 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist") pod "speaker-l8phj" (UID: "d656ca89-f955-44bb-9944-f75bf485a254") : secret "metallb-memberlist" not found Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.397948 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-metrics-certs\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.451620 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.475652 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-bs5nk"] Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.529980 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" event={"ID":"35357e2c-2a03-46f8-bc28-f7daad3b679d","Type":"ContainerStarted","Data":"36b83af52be953d7bad40bb5c6ad5f1848a66b78add84e2007e4c0e8dcbff369"} Feb 26 11:27:07 crc kubenswrapper[4699]: I0226 11:27:07.531348 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-bs5nk" event={"ID":"6ef6a9d7-6997-485a-a812-ded9d3a2df85","Type":"ContainerStarted","Data":"5e8fb4298f3a3a8bd86209e0afd9b52912537cb1c66df9db24fa6b59e4dc1adf"} Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.405306 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.410006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d656ca89-f955-44bb-9944-f75bf485a254-memberlist\") pod \"speaker-l8phj\" (UID: \"d656ca89-f955-44bb-9944-f75bf485a254\") " pod="metallb-system/speaker-l8phj" Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.469499 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-l8phj" Feb 26 11:27:08 crc kubenswrapper[4699]: W0226 11:27:08.489924 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd656ca89_f955_44bb_9944_f75bf485a254.slice/crio-cccd11e4a1914d85bafde0a0bc8f0324748197e5ab8421f4b32496e0a983bcc1 WatchSource:0}: Error finding container cccd11e4a1914d85bafde0a0bc8f0324748197e5ab8421f4b32496e0a983bcc1: Status 404 returned error can't find the container with id cccd11e4a1914d85bafde0a0bc8f0324748197e5ab8421f4b32496e0a983bcc1 Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.538460 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"5ef387f6ad8a8d3b2228d71c65732676cadf36a3a39c10b8cce49edd482081d2"} Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.539290 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l8phj" event={"ID":"d656ca89-f955-44bb-9944-f75bf485a254","Type":"ContainerStarted","Data":"cccd11e4a1914d85bafde0a0bc8f0324748197e5ab8421f4b32496e0a983bcc1"} Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.541185 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-bs5nk" event={"ID":"6ef6a9d7-6997-485a-a812-ded9d3a2df85","Type":"ContainerStarted","Data":"abfb559eaf2df8b227fabdc97cd10e7b3bf75b132967329540d122f07db30d08"} Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.541225 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-bs5nk" event={"ID":"6ef6a9d7-6997-485a-a812-ded9d3a2df85","Type":"ContainerStarted","Data":"8ed671135b943a49812112a63ded16910b3411204ed900f6d6c2e5b474cf1d3c"} Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.541346 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:08 crc kubenswrapper[4699]: I0226 11:27:08.566566 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-bs5nk" podStartSLOduration=2.566549206 podStartE2EDuration="2.566549206s" podCreationTimestamp="2026-02-26 11:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:27:08.563997893 +0000 UTC m=+974.374824347" watchObservedRunningTime="2026-02-26 11:27:08.566549206 +0000 UTC m=+974.377375640" Feb 26 11:27:09 crc kubenswrapper[4699]: I0226 11:27:09.595574 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l8phj" event={"ID":"d656ca89-f955-44bb-9944-f75bf485a254","Type":"ContainerStarted","Data":"681998004cee42d0a153586aabffbdfee4c3452ac997f259a94185dfa7c96b01"} Feb 26 11:27:09 crc kubenswrapper[4699]: I0226 11:27:09.595876 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l8phj" event={"ID":"d656ca89-f955-44bb-9944-f75bf485a254","Type":"ContainerStarted","Data":"0df399b4331ab1b5d9d958dbd2a1435311625e3cb5abd934bab72d2e6c93415d"} Feb 26 11:27:09 crc kubenswrapper[4699]: I0226 11:27:09.595900 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-l8phj" Feb 26 11:27:09 crc kubenswrapper[4699]: I0226 11:27:09.632172 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-l8phj" podStartSLOduration=3.632155353 podStartE2EDuration="3.632155353s" podCreationTimestamp="2026-02-26 11:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:27:09.630408542 +0000 UTC m=+975.441234976" watchObservedRunningTime="2026-02-26 11:27:09.632155353 +0000 UTC m=+975.442981787" Feb 26 11:27:11 crc kubenswrapper[4699]: 
I0226 11:27:11.585210 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:27:11 crc kubenswrapper[4699]: I0226 11:27:11.585583 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:27:15 crc kubenswrapper[4699]: I0226 11:27:15.635235 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" event={"ID":"35357e2c-2a03-46f8-bc28-f7daad3b679d","Type":"ContainerStarted","Data":"fdbf34372fdba9313eb7c15b9b8f16cb8b02ae8a3989978a732171eb3899389f"} Feb 26 11:27:15 crc kubenswrapper[4699]: I0226 11:27:15.635535 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:15 crc kubenswrapper[4699]: I0226 11:27:15.636977 4699 generic.go:334] "Generic (PLEG): container finished" podID="dfa29d17-a66a-42fe-8275-1526f8fb6dc9" containerID="261b01236f9fb41efd0623279170275ff90996043ed0669d29633d7b8600b866" exitCode=0 Feb 26 11:27:15 crc kubenswrapper[4699]: I0226 11:27:15.637092 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerDied","Data":"261b01236f9fb41efd0623279170275ff90996043ed0669d29633d7b8600b866"} Feb 26 11:27:15 crc kubenswrapper[4699]: I0226 11:27:15.656769 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" 
podStartSLOduration=2.3067763 podStartE2EDuration="9.656751009s" podCreationTimestamp="2026-02-26 11:27:06 +0000 UTC" firstStartedPulling="2026-02-26 11:27:07.385045306 +0000 UTC m=+973.195871740" lastFinishedPulling="2026-02-26 11:27:14.735020015 +0000 UTC m=+980.545846449" observedRunningTime="2026-02-26 11:27:15.652373423 +0000 UTC m=+981.463199947" watchObservedRunningTime="2026-02-26 11:27:15.656751009 +0000 UTC m=+981.467577443" Feb 26 11:27:16 crc kubenswrapper[4699]: I0226 11:27:16.647770 4699 generic.go:334] "Generic (PLEG): container finished" podID="dfa29d17-a66a-42fe-8275-1526f8fb6dc9" containerID="dfb9a7698c5d387075f4be606b07d8b47db95cefab1e9938cbf04f21cb0a6158" exitCode=0 Feb 26 11:27:16 crc kubenswrapper[4699]: I0226 11:27:16.647880 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerDied","Data":"dfb9a7698c5d387075f4be606b07d8b47db95cefab1e9938cbf04f21cb0a6158"} Feb 26 11:27:17 crc kubenswrapper[4699]: I0226 11:27:17.656517 4699 generic.go:334] "Generic (PLEG): container finished" podID="dfa29d17-a66a-42fe-8275-1526f8fb6dc9" containerID="daaffb47283e3534600ea7af26f7849913fe543978df0d1cb643ecaa3c98b251" exitCode=0 Feb 26 11:27:17 crc kubenswrapper[4699]: I0226 11:27:17.656584 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerDied","Data":"daaffb47283e3534600ea7af26f7849913fe543978df0d1cb643ecaa3c98b251"} Feb 26 11:27:18 crc kubenswrapper[4699]: I0226 11:27:18.476737 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-l8phj" Feb 26 11:27:18 crc kubenswrapper[4699]: I0226 11:27:18.677185 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" 
event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"2a8083fdf034370ef52d6d7d21d33efd71b3363f5d5e4f79c3c9d5ac7c381aa7"} Feb 26 11:27:18 crc kubenswrapper[4699]: I0226 11:27:18.677236 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"992cde06cda987e365a666f1898a916a2d85784dd4cd68e146069de3babd2e61"} Feb 26 11:27:18 crc kubenswrapper[4699]: I0226 11:27:18.677255 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"97b0708f178e32a7989588c33b2d46e2468ae3173dc6dc68075301987b71bcba"} Feb 26 11:27:18 crc kubenswrapper[4699]: I0226 11:27:18.677265 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"57dffa9df503a62eee0346e241488af2ccf89bcd0fc2e79fd96f293eed9a5ae0"} Feb 26 11:27:19 crc kubenswrapper[4699]: I0226 11:27:19.694193 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"0e1f797ed8377e27dc40cb8a80160c0bfd25198075e343c11b66c8182a0955a3"} Feb 26 11:27:19 crc kubenswrapper[4699]: I0226 11:27:19.694656 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wszs7" event={"ID":"dfa29d17-a66a-42fe-8275-1526f8fb6dc9","Type":"ContainerStarted","Data":"33acc66c12b78c941624bbf6dc15f24f63826732cff6426dd644b061a288760b"} Feb 26 11:27:19 crc kubenswrapper[4699]: I0226 11:27:19.694799 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:19 crc kubenswrapper[4699]: I0226 11:27:19.727230 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-wszs7" podStartSLOduration=6.575731016 podStartE2EDuration="13.727197675s" podCreationTimestamp="2026-02-26 11:27:06 +0000 UTC" firstStartedPulling="2026-02-26 11:27:07.591293438 +0000 UTC m=+973.402119872" lastFinishedPulling="2026-02-26 11:27:14.742760097 +0000 UTC m=+980.553586531" observedRunningTime="2026-02-26 11:27:19.719942677 +0000 UTC m=+985.530769131" watchObservedRunningTime="2026-02-26 11:27:19.727197675 +0000 UTC m=+985.538024129" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.420283 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.420986 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.422972 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.423027 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gr4l9" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.423177 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.447310 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.498857 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrhj\" (UniqueName: \"kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj\") pod \"openstack-operator-index-5ckfn\" (UID: \"46dbbdb1-7181-4c0f-a593-3536bad6290c\") " 
pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.600202 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrhj\" (UniqueName: \"kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj\") pod \"openstack-operator-index-5ckfn\" (UID: \"46dbbdb1-7181-4c0f-a593-3536bad6290c\") " pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.626328 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrhj\" (UniqueName: \"kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj\") pod \"openstack-operator-index-5ckfn\" (UID: \"46dbbdb1-7181-4c0f-a593-3536bad6290c\") " pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:21 crc kubenswrapper[4699]: I0226 11:27:21.741476 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:22 crc kubenswrapper[4699]: I0226 11:27:22.148003 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:22 crc kubenswrapper[4699]: I0226 11:27:22.452578 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:22 crc kubenswrapper[4699]: I0226 11:27:22.493003 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:22 crc kubenswrapper[4699]: I0226 11:27:22.714210 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5ckfn" event={"ID":"46dbbdb1-7181-4c0f-a593-3536bad6290c","Type":"ContainerStarted","Data":"a0c95212ad762b90262b586cd0a018ca600aeeb8aaf5bb8f18c20bd3e5190bb7"} Feb 26 11:27:24 crc kubenswrapper[4699]: I0226 11:27:24.793270 4699 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.401086 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gmh8j"] Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.401946 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.408328 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gmh8j"] Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.551428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vj7b\" (UniqueName: \"kubernetes.io/projected/22cfe789-87ae-4b23-91c2-cbb5112e4285-kube-api-access-5vj7b\") pod \"openstack-operator-index-gmh8j\" (UID: \"22cfe789-87ae-4b23-91c2-cbb5112e4285\") " pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.653766 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vj7b\" (UniqueName: \"kubernetes.io/projected/22cfe789-87ae-4b23-91c2-cbb5112e4285-kube-api-access-5vj7b\") pod \"openstack-operator-index-gmh8j\" (UID: \"22cfe789-87ae-4b23-91c2-cbb5112e4285\") " pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.692365 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vj7b\" (UniqueName: \"kubernetes.io/projected/22cfe789-87ae-4b23-91c2-cbb5112e4285-kube-api-access-5vj7b\") pod \"openstack-operator-index-gmh8j\" (UID: \"22cfe789-87ae-4b23-91c2-cbb5112e4285\") " pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:25 crc kubenswrapper[4699]: I0226 11:27:25.724717 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.425281 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gmh8j"] Feb 26 11:27:26 crc kubenswrapper[4699]: W0226 11:27:26.427712 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22cfe789_87ae_4b23_91c2_cbb5112e4285.slice/crio-14cad797602d39bf559a3de2c76af4186814e8696471ea7b8a2457177441373a WatchSource:0}: Error finding container 14cad797602d39bf559a3de2c76af4186814e8696471ea7b8a2457177441373a: Status 404 returned error can't find the container with id 14cad797602d39bf559a3de2c76af4186814e8696471ea7b8a2457177441373a Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.738678 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gmh8j" event={"ID":"22cfe789-87ae-4b23-91c2-cbb5112e4285","Type":"ContainerStarted","Data":"65250f92fd240c8235daf4676095c1c679111ac998e059a2d7def87554e89884"} Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.738901 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gmh8j" event={"ID":"22cfe789-87ae-4b23-91c2-cbb5112e4285","Type":"ContainerStarted","Data":"14cad797602d39bf559a3de2c76af4186814e8696471ea7b8a2457177441373a"} Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.740483 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5ckfn" event={"ID":"46dbbdb1-7181-4c0f-a593-3536bad6290c","Type":"ContainerStarted","Data":"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5"} Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.740565 4699 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-5ckfn" podUID="46dbbdb1-7181-4c0f-a593-3536bad6290c" containerName="registry-server" containerID="cri-o://df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5" gracePeriod=2 Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.756613 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gmh8j" podStartSLOduration=1.70828504 podStartE2EDuration="1.756593525s" podCreationTimestamp="2026-02-26 11:27:25 +0000 UTC" firstStartedPulling="2026-02-26 11:27:26.431151205 +0000 UTC m=+992.241977639" lastFinishedPulling="2026-02-26 11:27:26.47945969 +0000 UTC m=+992.290286124" observedRunningTime="2026-02-26 11:27:26.751939581 +0000 UTC m=+992.562766015" watchObservedRunningTime="2026-02-26 11:27:26.756593525 +0000 UTC m=+992.567419959" Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.771309 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5ckfn" podStartSLOduration=1.876372302 podStartE2EDuration="5.771287596s" podCreationTimestamp="2026-02-26 11:27:21 +0000 UTC" firstStartedPulling="2026-02-26 11:27:22.153795308 +0000 UTC m=+987.964621742" lastFinishedPulling="2026-02-26 11:27:26.048710602 +0000 UTC m=+991.859537036" observedRunningTime="2026-02-26 11:27:26.769732061 +0000 UTC m=+992.580558495" watchObservedRunningTime="2026-02-26 11:27:26.771287596 +0000 UTC m=+992.582114040" Feb 26 11:27:26 crc kubenswrapper[4699]: I0226 11:27:26.999813 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-bs5nk" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.073164 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.173187 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzrhj\" (UniqueName: \"kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj\") pod \"46dbbdb1-7181-4c0f-a593-3536bad6290c\" (UID: \"46dbbdb1-7181-4c0f-a593-3536bad6290c\") " Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.181039 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj" (OuterVolumeSpecName: "kube-api-access-qzrhj") pod "46dbbdb1-7181-4c0f-a593-3536bad6290c" (UID: "46dbbdb1-7181-4c0f-a593-3536bad6290c"). InnerVolumeSpecName "kube-api-access-qzrhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.195543 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-svsrb" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.274845 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzrhj\" (UniqueName: \"kubernetes.io/projected/46dbbdb1-7181-4c0f-a593-3536bad6290c-kube-api-access-qzrhj\") on node \"crc\" DevicePath \"\"" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.454551 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wszs7" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.746884 4699 generic.go:334] "Generic (PLEG): container finished" podID="46dbbdb1-7181-4c0f-a593-3536bad6290c" containerID="df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5" exitCode=0 Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.746924 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5ckfn" 
event={"ID":"46dbbdb1-7181-4c0f-a593-3536bad6290c","Type":"ContainerDied","Data":"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5"} Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.746956 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5ckfn" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.746979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5ckfn" event={"ID":"46dbbdb1-7181-4c0f-a593-3536bad6290c","Type":"ContainerDied","Data":"a0c95212ad762b90262b586cd0a018ca600aeeb8aaf5bb8f18c20bd3e5190bb7"} Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.747005 4699 scope.go:117] "RemoveContainer" containerID="df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.769182 4699 scope.go:117] "RemoveContainer" containerID="df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5" Feb 26 11:27:27 crc kubenswrapper[4699]: E0226 11:27:27.770164 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5\": container with ID starting with df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5 not found: ID does not exist" containerID="df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.770199 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5"} err="failed to get container status \"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5\": rpc error: code = NotFound desc = could not find container \"df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5\": container with ID starting with 
df14266671fd692c1bfed6867681c81dcfd8f90bac3b5b1e23224ac3948f1dd5 not found: ID does not exist" Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.777072 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:27 crc kubenswrapper[4699]: I0226 11:27:27.781772 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5ckfn"] Feb 26 11:27:28 crc kubenswrapper[4699]: I0226 11:27:28.268392 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46dbbdb1-7181-4c0f-a593-3536bad6290c" path="/var/lib/kubelet/pods/46dbbdb1-7181-4c0f-a593-3536bad6290c/volumes" Feb 26 11:27:35 crc kubenswrapper[4699]: I0226 11:27:35.725878 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:35 crc kubenswrapper[4699]: I0226 11:27:35.726450 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:35 crc kubenswrapper[4699]: I0226 11:27:35.750534 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:35 crc kubenswrapper[4699]: I0226 11:27:35.818384 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gmh8j" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.444296 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8"] Feb 26 11:27:37 crc kubenswrapper[4699]: E0226 11:27:37.444829 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dbbdb1-7181-4c0f-a593-3536bad6290c" containerName="registry-server" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.444841 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46dbbdb1-7181-4c0f-a593-3536bad6290c" containerName="registry-server" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.444968 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dbbdb1-7181-4c0f-a593-3536bad6290c" containerName="registry-server" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.445951 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.448461 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4fpg2" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.460928 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8"] Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.616490 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfp82\" (UniqueName: \"kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.616536 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.616563 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.717373 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfp82\" (UniqueName: \"kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.717760 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.717912 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.718359 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util\") pod 
\"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.718389 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.741154 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfp82\" (UniqueName: \"kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82\") pod \"dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:37 crc kubenswrapper[4699]: I0226 11:27:37.766921 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:38 crc kubenswrapper[4699]: I0226 11:27:38.230718 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8"] Feb 26 11:27:38 crc kubenswrapper[4699]: W0226 11:27:38.234357 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449351cd_8256_4e21_b27e_be3c4db11ca5.slice/crio-59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d WatchSource:0}: Error finding container 59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d: Status 404 returned error can't find the container with id 59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d Feb 26 11:27:38 crc kubenswrapper[4699]: I0226 11:27:38.819652 4699 generic.go:334] "Generic (PLEG): container finished" podID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerID="f2f7f6c58efd9c486cb559854986255a713481e572462d9eed87f2bc7e30f241" exitCode=0 Feb 26 11:27:38 crc kubenswrapper[4699]: I0226 11:27:38.819914 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" event={"ID":"449351cd-8256-4e21-b27e-be3c4db11ca5","Type":"ContainerDied","Data":"f2f7f6c58efd9c486cb559854986255a713481e572462d9eed87f2bc7e30f241"} Feb 26 11:27:38 crc kubenswrapper[4699]: I0226 11:27:38.820396 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" event={"ID":"449351cd-8256-4e21-b27e-be3c4db11ca5","Type":"ContainerStarted","Data":"59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d"} Feb 26 11:27:39 crc kubenswrapper[4699]: E0226 11:27:39.395265 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449351cd_8256_4e21_b27e_be3c4db11ca5.slice/crio-314ddf81598d848087d655652620d779cfd60ffaee228700c37de9468ffba078.scope\": RecentStats: unable to find data in memory cache]" Feb 26 11:27:39 crc kubenswrapper[4699]: I0226 11:27:39.838712 4699 generic.go:334] "Generic (PLEG): container finished" podID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerID="314ddf81598d848087d655652620d779cfd60ffaee228700c37de9468ffba078" exitCode=0 Feb 26 11:27:39 crc kubenswrapper[4699]: I0226 11:27:39.838765 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" event={"ID":"449351cd-8256-4e21-b27e-be3c4db11ca5","Type":"ContainerDied","Data":"314ddf81598d848087d655652620d779cfd60ffaee228700c37de9468ffba078"} Feb 26 11:27:40 crc kubenswrapper[4699]: I0226 11:27:40.851743 4699 generic.go:334] "Generic (PLEG): container finished" podID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerID="5307481d5c588fb811e1d79efc2ddc4f8eef7eda8bb05246530f8cb377179764" exitCode=0 Feb 26 11:27:40 crc kubenswrapper[4699]: I0226 11:27:40.852183 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" event={"ID":"449351cd-8256-4e21-b27e-be3c4db11ca5","Type":"ContainerDied","Data":"5307481d5c588fb811e1d79efc2ddc4f8eef7eda8bb05246530f8cb377179764"} Feb 26 11:27:41 crc kubenswrapper[4699]: I0226 11:27:41.585091 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:27:41 crc kubenswrapper[4699]: I0226 11:27:41.585189 4699 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.114033 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.282171 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfp82\" (UniqueName: \"kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82\") pod \"449351cd-8256-4e21-b27e-be3c4db11ca5\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.282281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util\") pod \"449351cd-8256-4e21-b27e-be3c4db11ca5\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.282333 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle\") pod \"449351cd-8256-4e21-b27e-be3c4db11ca5\" (UID: \"449351cd-8256-4e21-b27e-be3c4db11ca5\") " Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.283040 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle" (OuterVolumeSpecName: "bundle") pod "449351cd-8256-4e21-b27e-be3c4db11ca5" (UID: "449351cd-8256-4e21-b27e-be3c4db11ca5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.290016 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82" (OuterVolumeSpecName: "kube-api-access-qfp82") pod "449351cd-8256-4e21-b27e-be3c4db11ca5" (UID: "449351cd-8256-4e21-b27e-be3c4db11ca5"). InnerVolumeSpecName "kube-api-access-qfp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.297313 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util" (OuterVolumeSpecName: "util") pod "449351cd-8256-4e21-b27e-be3c4db11ca5" (UID: "449351cd-8256-4e21-b27e-be3c4db11ca5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.383711 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfp82\" (UniqueName: \"kubernetes.io/projected/449351cd-8256-4e21-b27e-be3c4db11ca5-kube-api-access-qfp82\") on node \"crc\" DevicePath \"\"" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.383749 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-util\") on node \"crc\" DevicePath \"\"" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.383767 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/449351cd-8256-4e21-b27e-be3c4db11ca5-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.881988 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" 
event={"ID":"449351cd-8256-4e21-b27e-be3c4db11ca5","Type":"ContainerDied","Data":"59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d"} Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.882066 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59fbb8ef004b1a4cf86a817f17537a83fc7a638063f9abe4dd2c819f23229f7d" Feb 26 11:27:42 crc kubenswrapper[4699]: I0226 11:27:42.882101 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.622154 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd"] Feb 26 11:27:49 crc kubenswrapper[4699]: E0226 11:27:49.623021 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="extract" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.623037 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="extract" Feb 26 11:27:49 crc kubenswrapper[4699]: E0226 11:27:49.623062 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="util" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.623069 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="util" Feb 26 11:27:49 crc kubenswrapper[4699]: E0226 11:27:49.623085 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="pull" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.623093 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="pull" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.623253 4699 
memory_manager.go:354] "RemoveStaleState removing state" podUID="449351cd-8256-4e21-b27e-be3c4db11ca5" containerName="extract" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.623778 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.626429 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-b88ct" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.647320 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd"] Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.785100 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbhn\" (UniqueName: \"kubernetes.io/projected/3a6d1210-ece5-4666-80bf-c7c7821e441c-kube-api-access-hbbhn\") pod \"openstack-operator-controller-init-7c5cc54f9c-wjrrd\" (UID: \"3a6d1210-ece5-4666-80bf-c7c7821e441c\") " pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.886277 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbhn\" (UniqueName: \"kubernetes.io/projected/3a6d1210-ece5-4666-80bf-c7c7821e441c-kube-api-access-hbbhn\") pod \"openstack-operator-controller-init-7c5cc54f9c-wjrrd\" (UID: \"3a6d1210-ece5-4666-80bf-c7c7821e441c\") " pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.904799 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbhn\" (UniqueName: \"kubernetes.io/projected/3a6d1210-ece5-4666-80bf-c7c7821e441c-kube-api-access-hbbhn\") pod 
\"openstack-operator-controller-init-7c5cc54f9c-wjrrd\" (UID: \"3a6d1210-ece5-4666-80bf-c7c7821e441c\") " pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:49 crc kubenswrapper[4699]: I0226 11:27:49.945995 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:50 crc kubenswrapper[4699]: I0226 11:27:50.375330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd"] Feb 26 11:27:50 crc kubenswrapper[4699]: I0226 11:27:50.926184 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" event={"ID":"3a6d1210-ece5-4666-80bf-c7c7821e441c","Type":"ContainerStarted","Data":"14f18c43e995834048418451242f186b58d9e588a4009246b7c26d94d2fc0672"} Feb 26 11:27:55 crc kubenswrapper[4699]: I0226 11:27:55.955998 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" event={"ID":"3a6d1210-ece5-4666-80bf-c7c7821e441c","Type":"ContainerStarted","Data":"e9a618a01d013f465ebb83060bd50d4d1a228b78a25c5b2f5d6a002e0b768a20"} Feb 26 11:27:55 crc kubenswrapper[4699]: I0226 11:27:55.956593 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:27:55 crc kubenswrapper[4699]: I0226 11:27:55.983692 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" podStartSLOduration=2.325617336 podStartE2EDuration="6.983674208s" podCreationTimestamp="2026-02-26 11:27:49 +0000 UTC" firstStartedPulling="2026-02-26 11:27:50.382244324 +0000 UTC m=+1016.193070758" lastFinishedPulling="2026-02-26 11:27:55.040301186 +0000 UTC m=+1020.851127630" 
observedRunningTime="2026-02-26 11:27:55.980636141 +0000 UTC m=+1021.791462595" watchObservedRunningTime="2026-02-26 11:27:55.983674208 +0000 UTC m=+1021.794500652" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.699415 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.700917 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.718570 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.800331 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.800403 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq6hk\" (UniqueName: \"kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.800452 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 
11:27:57.901819 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq6hk\" (UniqueName: \"kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.902334 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.902411 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.903089 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.903135 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:57 crc kubenswrapper[4699]: I0226 11:27:57.921840 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq6hk\" (UniqueName: \"kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk\") pod \"community-operators-v2mgq\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:58 crc kubenswrapper[4699]: I0226 11:27:58.017746 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:27:58 crc kubenswrapper[4699]: I0226 11:27:58.338210 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:27:58 crc kubenswrapper[4699]: I0226 11:27:58.976546 4699 generic.go:334] "Generic (PLEG): container finished" podID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerID="d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0" exitCode=0 Feb 26 11:27:58 crc kubenswrapper[4699]: I0226 11:27:58.976587 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerDied","Data":"d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0"} Feb 26 11:27:58 crc kubenswrapper[4699]: I0226 11:27:58.976612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerStarted","Data":"753782b8fb80b4cc413697d69c7fb60630bc30f6fc60cfabadcf0bf9e2078d1c"} Feb 26 11:27:59 crc kubenswrapper[4699]: I0226 11:27:59.983752 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerStarted","Data":"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4"} Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.134103 4699 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535088-rwpx5"] Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.136641 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.138835 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.139378 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.142108 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.157713 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535088-rwpx5"] Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.238477 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt\") pod \"auto-csr-approver-29535088-rwpx5\" (UID: \"d05b2b3d-2906-4acc-aaa2-2f2674e46f27\") " pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.340211 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt\") pod \"auto-csr-approver-29535088-rwpx5\" (UID: \"d05b2b3d-2906-4acc-aaa2-2f2674e46f27\") " pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.361975 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt\") pod \"auto-csr-approver-29535088-rwpx5\" (UID: \"d05b2b3d-2906-4acc-aaa2-2f2674e46f27\") " pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.469467 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.637021 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535088-rwpx5"] Feb 26 11:28:00 crc kubenswrapper[4699]: W0226 11:28:00.645029 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd05b2b3d_2906_4acc_aaa2_2f2674e46f27.slice/crio-0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c WatchSource:0}: Error finding container 0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c: Status 404 returned error can't find the container with id 0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.991604 4699 generic.go:334] "Generic (PLEG): container finished" podID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerID="9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4" exitCode=0 Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.991676 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerDied","Data":"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4"} Feb 26 11:28:00 crc kubenswrapper[4699]: I0226 11:28:00.993047 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" 
event={"ID":"d05b2b3d-2906-4acc-aaa2-2f2674e46f27","Type":"ContainerStarted","Data":"0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c"} Feb 26 11:28:02 crc kubenswrapper[4699]: I0226 11:28:02.001861 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerStarted","Data":"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c"} Feb 26 11:28:02 crc kubenswrapper[4699]: I0226 11:28:02.024389 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v2mgq" podStartSLOduration=2.609304583 podStartE2EDuration="5.024373575s" podCreationTimestamp="2026-02-26 11:27:57 +0000 UTC" firstStartedPulling="2026-02-26 11:27:58.977741598 +0000 UTC m=+1024.788568032" lastFinishedPulling="2026-02-26 11:28:01.39281059 +0000 UTC m=+1027.203637024" observedRunningTime="2026-02-26 11:28:02.019551597 +0000 UTC m=+1027.830378031" watchObservedRunningTime="2026-02-26 11:28:02.024373575 +0000 UTC m=+1027.835200009" Feb 26 11:28:03 crc kubenswrapper[4699]: I0226 11:28:03.009448 4699 generic.go:334] "Generic (PLEG): container finished" podID="d05b2b3d-2906-4acc-aaa2-2f2674e46f27" containerID="1a0ef1ef6d99c76627fc03dba6d4f740ea96e617f11be2b18231f70b40dd8703" exitCode=0 Feb 26 11:28:03 crc kubenswrapper[4699]: I0226 11:28:03.009573 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" event={"ID":"d05b2b3d-2906-4acc-aaa2-2f2674e46f27","Type":"ContainerDied","Data":"1a0ef1ef6d99c76627fc03dba6d4f740ea96e617f11be2b18231f70b40dd8703"} Feb 26 11:28:04 crc kubenswrapper[4699]: I0226 11:28:04.244474 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:04 crc kubenswrapper[4699]: I0226 11:28:04.389069 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt\") pod \"d05b2b3d-2906-4acc-aaa2-2f2674e46f27\" (UID: \"d05b2b3d-2906-4acc-aaa2-2f2674e46f27\") " Feb 26 11:28:04 crc kubenswrapper[4699]: I0226 11:28:04.394343 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt" (OuterVolumeSpecName: "kube-api-access-hdbpt") pod "d05b2b3d-2906-4acc-aaa2-2f2674e46f27" (UID: "d05b2b3d-2906-4acc-aaa2-2f2674e46f27"). InnerVolumeSpecName "kube-api-access-hdbpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:28:04 crc kubenswrapper[4699]: I0226 11:28:04.490252 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/d05b2b3d-2906-4acc-aaa2-2f2674e46f27-kube-api-access-hdbpt\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:05 crc kubenswrapper[4699]: I0226 11:28:05.030916 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" event={"ID":"d05b2b3d-2906-4acc-aaa2-2f2674e46f27","Type":"ContainerDied","Data":"0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c"} Feb 26 11:28:05 crc kubenswrapper[4699]: I0226 11:28:05.031344 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dac7df2c4897357eb059b21b5fc653d4db4b06327bc9ee7f07988361f4b297c" Feb 26 11:28:05 crc kubenswrapper[4699]: I0226 11:28:05.031025 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535088-rwpx5" Feb 26 11:28:05 crc kubenswrapper[4699]: I0226 11:28:05.288389 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535082-2l88q"] Feb 26 11:28:05 crc kubenswrapper[4699]: I0226 11:28:05.292714 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535082-2l88q"] Feb 26 11:28:06 crc kubenswrapper[4699]: I0226 11:28:06.269301 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96109ee-edc2-496a-b6bc-cffad5fb9a40" path="/var/lib/kubelet/pods/b96109ee-edc2-496a-b6bc-cffad5fb9a40/volumes" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.695207 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:07 crc kubenswrapper[4699]: E0226 11:28:07.695697 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05b2b3d-2906-4acc-aaa2-2f2674e46f27" containerName="oc" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.695708 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05b2b3d-2906-4acc-aaa2-2f2674e46f27" containerName="oc" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.695810 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05b2b3d-2906-4acc-aaa2-2f2674e46f27" containerName="oc" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.696648 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.716502 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.832610 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.832686 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.832712 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2hz\" (UniqueName: \"kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.933504 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.933571 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4t2hz\" (UniqueName: \"kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.933653 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.934049 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.934097 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:07 crc kubenswrapper[4699]: I0226 11:28:07.955397 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2hz\" (UniqueName: \"kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz\") pod \"certified-operators-s5dgj\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:08 crc kubenswrapper[4699]: I0226 11:28:08.013877 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:08 crc kubenswrapper[4699]: I0226 11:28:08.018024 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:08 crc kubenswrapper[4699]: I0226 11:28:08.018333 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:08 crc kubenswrapper[4699]: I0226 11:28:08.066874 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:08 crc kubenswrapper[4699]: I0226 11:28:08.842297 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:08 crc kubenswrapper[4699]: W0226 11:28:08.851732 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9301b29_bf1c_45bc_9192_d9513b5b0726.slice/crio-89b87751f0ad4bddeec5fe0cb86b72e858b80a42b2279b3ac84b208c0e16ab83 WatchSource:0}: Error finding container 89b87751f0ad4bddeec5fe0cb86b72e858b80a42b2279b3ac84b208c0e16ab83: Status 404 returned error can't find the container with id 89b87751f0ad4bddeec5fe0cb86b72e858b80a42b2279b3ac84b208c0e16ab83 Feb 26 11:28:09 crc kubenswrapper[4699]: I0226 11:28:09.057342 4699 generic.go:334] "Generic (PLEG): container finished" podID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerID="7960725ec9599c885caf90f1afa64d37fa9657019482f56a5ed54538ab6dd21e" exitCode=0 Feb 26 11:28:09 crc kubenswrapper[4699]: I0226 11:28:09.059167 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerDied","Data":"7960725ec9599c885caf90f1afa64d37fa9657019482f56a5ed54538ab6dd21e"} Feb 26 11:28:09 crc kubenswrapper[4699]: I0226 
11:28:09.059205 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerStarted","Data":"89b87751f0ad4bddeec5fe0cb86b72e858b80a42b2279b3ac84b208c0e16ab83"} Feb 26 11:28:09 crc kubenswrapper[4699]: I0226 11:28:09.099946 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:09 crc kubenswrapper[4699]: I0226 11:28:09.961058 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7c5cc54f9c-wjrrd" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.075160 4699 generic.go:334] "Generic (PLEG): container finished" podID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerID="eb6628a2b056813a322b32dff4bfd3967ebe2e37d7c7dfee12782f9f18908173" exitCode=0 Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.075217 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerDied","Data":"eb6628a2b056813a322b32dff4bfd3967ebe2e37d7c7dfee12782f9f18908173"} Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.489559 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.489771 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v2mgq" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="registry-server" containerID="cri-o://d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c" gracePeriod=2 Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.585508 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.585762 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.585833 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.597854 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.598107 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7" gracePeriod=600 Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.904604 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.987858 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq6hk\" (UniqueName: \"kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk\") pod \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.987904 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content\") pod \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.987927 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities\") pod \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\" (UID: \"a164bc1f-d5f1-4538-86c7-98edbe73d0af\") " Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.988900 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities" (OuterVolumeSpecName: "utilities") pod "a164bc1f-d5f1-4538-86c7-98edbe73d0af" (UID: "a164bc1f-d5f1-4538-86c7-98edbe73d0af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:28:11 crc kubenswrapper[4699]: I0226 11:28:11.992809 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk" (OuterVolumeSpecName: "kube-api-access-jq6hk") pod "a164bc1f-d5f1-4538-86c7-98edbe73d0af" (UID: "a164bc1f-d5f1-4538-86c7-98edbe73d0af"). InnerVolumeSpecName "kube-api-access-jq6hk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.043364 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a164bc1f-d5f1-4538-86c7-98edbe73d0af" (UID: "a164bc1f-d5f1-4538-86c7-98edbe73d0af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.084432 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7" exitCode=0 Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.084505 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7"} Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.084556 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03"} Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.084573 4699 scope.go:117] "RemoveContainer" containerID="bb4262ffa74d3c4cd8ca9d3a4ee81267fb75459ec0f5e9e96d1dd3934b8627ca" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.090587 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq6hk\" (UniqueName: \"kubernetes.io/projected/a164bc1f-d5f1-4538-86c7-98edbe73d0af-kube-api-access-jq6hk\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.090621 4699 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.090913 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a164bc1f-d5f1-4538-86c7-98edbe73d0af-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.094485 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerStarted","Data":"3bbe57bc2e73648e372826fca5459292ff76808cee8395184b212e51692ed135"} Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.097968 4699 generic.go:334] "Generic (PLEG): container finished" podID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerID="d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c" exitCode=0 Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.098007 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerDied","Data":"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c"} Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.098027 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2mgq" event={"ID":"a164bc1f-d5f1-4538-86c7-98edbe73d0af","Type":"ContainerDied","Data":"753782b8fb80b4cc413697d69c7fb60630bc30f6fc60cfabadcf0bf9e2078d1c"} Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.098067 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v2mgq" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.127916 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s5dgj" podStartSLOduration=2.655592584 podStartE2EDuration="5.127893555s" podCreationTimestamp="2026-02-26 11:28:07 +0000 UTC" firstStartedPulling="2026-02-26 11:28:09.060013047 +0000 UTC m=+1034.870839481" lastFinishedPulling="2026-02-26 11:28:11.532314018 +0000 UTC m=+1037.343140452" observedRunningTime="2026-02-26 11:28:12.121472453 +0000 UTC m=+1037.932298897" watchObservedRunningTime="2026-02-26 11:28:12.127893555 +0000 UTC m=+1037.938719989" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.128982 4699 scope.go:117] "RemoveContainer" containerID="d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.145291 4699 scope.go:117] "RemoveContainer" containerID="9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.146716 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.153639 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v2mgq"] Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.161521 4699 scope.go:117] "RemoveContainer" containerID="d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.176196 4699 scope.go:117] "RemoveContainer" containerID="d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c" Feb 26 11:28:12 crc kubenswrapper[4699]: E0226 11:28:12.176914 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c\": container with ID starting with d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c not found: ID does not exist" containerID="d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.176951 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c"} err="failed to get container status \"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c\": rpc error: code = NotFound desc = could not find container \"d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c\": container with ID starting with d70f1b379ebd7cd6ccf6ab55b5a33fec830b34554148591d7c2c273009123d7c not found: ID does not exist" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.176979 4699 scope.go:117] "RemoveContainer" containerID="9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4" Feb 26 11:28:12 crc kubenswrapper[4699]: E0226 11:28:12.177376 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4\": container with ID starting with 9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4 not found: ID does not exist" containerID="9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.177395 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4"} err="failed to get container status \"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4\": rpc error: code = NotFound desc = could not find container \"9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4\": container with ID 
starting with 9732faae7de71e9de9c0e1daa7b9482435142bd3504f1581216d9da8919621e4 not found: ID does not exist" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.177410 4699 scope.go:117] "RemoveContainer" containerID="d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0" Feb 26 11:28:12 crc kubenswrapper[4699]: E0226 11:28:12.177648 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0\": container with ID starting with d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0 not found: ID does not exist" containerID="d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.177687 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0"} err="failed to get container status \"d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0\": rpc error: code = NotFound desc = could not find container \"d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0\": container with ID starting with d21c4a1c2d1695fcffcda48cd540319b245b3271d7e80b7aed7da48b650046d0 not found: ID does not exist" Feb 26 11:28:12 crc kubenswrapper[4699]: I0226 11:28:12.268385 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" path="/var/lib/kubelet/pods/a164bc1f-d5f1-4538-86c7-98edbe73d0af/volumes" Feb 26 11:28:18 crc kubenswrapper[4699]: I0226 11:28:18.014796 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:18 crc kubenswrapper[4699]: I0226 11:28:18.015456 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:18 crc 
kubenswrapper[4699]: I0226 11:28:18.052716 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:18 crc kubenswrapper[4699]: I0226 11:28:18.186323 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:20 crc kubenswrapper[4699]: I0226 11:28:20.291610 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:20 crc kubenswrapper[4699]: I0226 11:28:20.292218 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s5dgj" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="registry-server" containerID="cri-o://3bbe57bc2e73648e372826fca5459292ff76808cee8395184b212e51692ed135" gracePeriod=2 Feb 26 11:28:21 crc kubenswrapper[4699]: I0226 11:28:21.108356 4699 scope.go:117] "RemoveContainer" containerID="13f8b1b98d014497027ee7037eac5f0ce1bbfdb9879bcfae0154cb4a61717ad1" Feb 26 11:28:21 crc kubenswrapper[4699]: I0226 11:28:21.171364 4699 generic.go:334] "Generic (PLEG): container finished" podID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerID="3bbe57bc2e73648e372826fca5459292ff76808cee8395184b212e51692ed135" exitCode=0 Feb 26 11:28:21 crc kubenswrapper[4699]: I0226 11:28:21.171446 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerDied","Data":"3bbe57bc2e73648e372826fca5459292ff76808cee8395184b212e51692ed135"} Feb 26 11:28:21 crc kubenswrapper[4699]: I0226 11:28:21.909820 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.070546 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content\") pod \"f9301b29-bf1c-45bc-9192-d9513b5b0726\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.070639 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2hz\" (UniqueName: \"kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz\") pod \"f9301b29-bf1c-45bc-9192-d9513b5b0726\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.070727 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities\") pod \"f9301b29-bf1c-45bc-9192-d9513b5b0726\" (UID: \"f9301b29-bf1c-45bc-9192-d9513b5b0726\") " Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.071951 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities" (OuterVolumeSpecName: "utilities") pod "f9301b29-bf1c-45bc-9192-d9513b5b0726" (UID: "f9301b29-bf1c-45bc-9192-d9513b5b0726"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.077280 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz" (OuterVolumeSpecName: "kube-api-access-4t2hz") pod "f9301b29-bf1c-45bc-9192-d9513b5b0726" (UID: "f9301b29-bf1c-45bc-9192-d9513b5b0726"). InnerVolumeSpecName "kube-api-access-4t2hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.172824 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.172857 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2hz\" (UniqueName: \"kubernetes.io/projected/f9301b29-bf1c-45bc-9192-d9513b5b0726-kube-api-access-4t2hz\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.183940 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5dgj" event={"ID":"f9301b29-bf1c-45bc-9192-d9513b5b0726","Type":"ContainerDied","Data":"89b87751f0ad4bddeec5fe0cb86b72e858b80a42b2279b3ac84b208c0e16ab83"} Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.183969 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s5dgj" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.184001 4699 scope.go:117] "RemoveContainer" containerID="3bbe57bc2e73648e372826fca5459292ff76808cee8395184b212e51692ed135" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.202161 4699 scope.go:117] "RemoveContainer" containerID="eb6628a2b056813a322b32dff4bfd3967ebe2e37d7c7dfee12782f9f18908173" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.221174 4699 scope.go:117] "RemoveContainer" containerID="7960725ec9599c885caf90f1afa64d37fa9657019482f56a5ed54538ab6dd21e" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.680395 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9301b29-bf1c-45bc-9192-d9513b5b0726" (UID: "f9301b29-bf1c-45bc-9192-d9513b5b0726"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.779904 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9301b29-bf1c-45bc-9192-d9513b5b0726-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.811382 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:22 crc kubenswrapper[4699]: I0226 11:28:22.816805 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s5dgj"] Feb 26 11:28:24 crc kubenswrapper[4699]: I0226 11:28:24.272656 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" path="/var/lib/kubelet/pods/f9301b29-bf1c-45bc-9192-d9513b5b0726/volumes" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.759543 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9"] Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761166 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="extract-content" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761188 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="extract-content" Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761203 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="extract-content" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761211 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="extract-content" Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761230 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761237 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761248 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761255 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761265 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="extract-utilities" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761273 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="extract-utilities" Feb 26 11:28:29 crc kubenswrapper[4699]: E0226 11:28:29.761281 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="extract-utilities" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761287 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="extract-utilities" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761405 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a164bc1f-d5f1-4538-86c7-98edbe73d0af" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761420 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9301b29-bf1c-45bc-9192-d9513b5b0726" containerName="registry-server" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.761857 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.763254 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lhv6d" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.764439 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.765366 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.767415 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9gzn2" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.771661 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.775643 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9kw\" (UniqueName: \"kubernetes.io/projected/35555f68-d5c4-44b2-9dfa-af5f91f57c7c-kube-api-access-gn9kw\") pod \"cinder-operator-controller-manager-55d77d7b5c-xw85z\" (UID: \"35555f68-d5c4-44b2-9dfa-af5f91f57c7c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.775780 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvsl\" (UniqueName: \"kubernetes.io/projected/1814471e-5f82-4464-9528-75da66d7235b-kube-api-access-nwvsl\") pod \"barbican-operator-controller-manager-868647ff47-sndb9\" (UID: 
\"1814471e-5f82-4464-9528-75da66d7235b\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.781165 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.787088 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.788222 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.803884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-87dx4" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.811572 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.812505 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.816316 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qfq8w" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.844423 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.864068 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.877636 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.878230 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9kw\" (UniqueName: \"kubernetes.io/projected/35555f68-d5c4-44b2-9dfa-af5f91f57c7c-kube-api-access-gn9kw\") pod \"cinder-operator-controller-manager-55d77d7b5c-xw85z\" (UID: \"35555f68-d5c4-44b2-9dfa-af5f91f57c7c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.878316 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cnkw\" (UniqueName: \"kubernetes.io/projected/07c2552c-8182-4cfe-a397-39ad287029e5-kube-api-access-2cnkw\") pod \"designate-operator-controller-manager-6d8bf5c495-4k4sm\" (UID: \"07c2552c-8182-4cfe-a397-39ad287029e5\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.878354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gwnhp\" (UniqueName: \"kubernetes.io/projected/27e251bb-8f9b-48d4-9ea3-81d03fd85244-kube-api-access-gwnhp\") pod \"glance-operator-controller-manager-784b5bb6c5-jh7vz\" (UID: \"27e251bb-8f9b-48d4-9ea3-81d03fd85244\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.878385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvsl\" (UniqueName: \"kubernetes.io/projected/1814471e-5f82-4464-9528-75da66d7235b-kube-api-access-nwvsl\") pod \"barbican-operator-controller-manager-868647ff47-sndb9\" (UID: \"1814471e-5f82-4464-9528-75da66d7235b\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.878925 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.881959 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vxg4r" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.919303 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvsl\" (UniqueName: \"kubernetes.io/projected/1814471e-5f82-4464-9528-75da66d7235b-kube-api-access-nwvsl\") pod \"barbican-operator-controller-manager-868647ff47-sndb9\" (UID: \"1814471e-5f82-4464-9528-75da66d7235b\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.919374 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9kw\" (UniqueName: \"kubernetes.io/projected/35555f68-d5c4-44b2-9dfa-af5f91f57c7c-kube-api-access-gn9kw\") pod \"cinder-operator-controller-manager-55d77d7b5c-xw85z\" (UID: 
\"35555f68-d5c4-44b2-9dfa-af5f91f57c7c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.940086 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.940917 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.946322 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wsnzb" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.946833 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.974328 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.979378 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj2d2\" (UniqueName: \"kubernetes.io/projected/7b204025-d5ff-4c74-96b9-6774b62e0cc4-kube-api-access-pj2d2\") pod \"heat-operator-controller-manager-69f49c598c-t8c9f\" (UID: \"7b204025-d5ff-4c74-96b9-6774b62e0cc4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.979471 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrm4v\" (UniqueName: \"kubernetes.io/projected/619dff06-7255-4aab-9ffe-9f2561bcc904-kube-api-access-wrm4v\") pod \"horizon-operator-controller-manager-5b9b8895d5-qf9vd\" (UID: 
\"619dff06-7255-4aab-9ffe-9f2561bcc904\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.979507 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cnkw\" (UniqueName: \"kubernetes.io/projected/07c2552c-8182-4cfe-a397-39ad287029e5-kube-api-access-2cnkw\") pod \"designate-operator-controller-manager-6d8bf5c495-4k4sm\" (UID: \"07c2552c-8182-4cfe-a397-39ad287029e5\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.979554 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnhp\" (UniqueName: \"kubernetes.io/projected/27e251bb-8f9b-48d4-9ea3-81d03fd85244-kube-api-access-gwnhp\") pod \"glance-operator-controller-manager-784b5bb6c5-jh7vz\" (UID: \"27e251bb-8f9b-48d4-9ea3-81d03fd85244\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.980234 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.981230 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.987302 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p"] Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.988138 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5mgbn" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.988287 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 26 11:28:29 crc kubenswrapper[4699]: I0226 11:28:29.989778 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.000215 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.002505 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mqvph" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.023714 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cnkw\" (UniqueName: \"kubernetes.io/projected/07c2552c-8182-4cfe-a397-39ad287029e5-kube-api-access-2cnkw\") pod \"designate-operator-controller-manager-6d8bf5c495-4k4sm\" (UID: \"07c2552c-8182-4cfe-a397-39ad287029e5\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.028701 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p"] Feb 26 11:28:30 crc 
kubenswrapper[4699]: I0226 11:28:30.040573 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnhp\" (UniqueName: \"kubernetes.io/projected/27e251bb-8f9b-48d4-9ea3-81d03fd85244-kube-api-access-gwnhp\") pod \"glance-operator-controller-manager-784b5bb6c5-jh7vz\" (UID: \"27e251bb-8f9b-48d4-9ea3-81d03fd85244\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.080294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj2d2\" (UniqueName: \"kubernetes.io/projected/7b204025-d5ff-4c74-96b9-6774b62e0cc4-kube-api-access-pj2d2\") pod \"heat-operator-controller-manager-69f49c598c-t8c9f\" (UID: \"7b204025-d5ff-4c74-96b9-6774b62e0cc4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.080367 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5jwl\" (UniqueName: \"kubernetes.io/projected/afbeb2d8-c332-447b-a931-9fe7b246914d-kube-api-access-s5jwl\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.080423 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrm4v\" (UniqueName: \"kubernetes.io/projected/619dff06-7255-4aab-9ffe-9f2561bcc904-kube-api-access-wrm4v\") pod \"horizon-operator-controller-manager-5b9b8895d5-qf9vd\" (UID: \"619dff06-7255-4aab-9ffe-9f2561bcc904\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.080466 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2fjxs\" (UniqueName: \"kubernetes.io/projected/d56efcbf-3414-4bd1-9cbf-d56c434ac529-kube-api-access-2fjxs\") pod \"ironic-operator-controller-manager-554564d7fc-5k85p\" (UID: \"d56efcbf-3414-4bd1-9cbf-d56c434ac529\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.080491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.085337 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.086184 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.097884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-v465z" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.107169 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.108178 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.110554 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.119598 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6jz96" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.120054 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.120834 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.123731 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9d64c" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.127083 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.139405 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.139871 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.157494 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.158553 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrm4v\" (UniqueName: \"kubernetes.io/projected/619dff06-7255-4aab-9ffe-9f2561bcc904-kube-api-access-wrm4v\") pod \"horizon-operator-controller-manager-5b9b8895d5-qf9vd\" (UID: \"619dff06-7255-4aab-9ffe-9f2561bcc904\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.173161 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.175806 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.176998 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181234 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj2d2\" (UniqueName: \"kubernetes.io/projected/7b204025-d5ff-4c74-96b9-6774b62e0cc4-kube-api-access-pj2d2\") pod \"heat-operator-controller-manager-69f49c598c-t8c9f\" (UID: \"7b204025-d5ff-4c74-96b9-6774b62e0cc4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181824 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgcg\" (UniqueName: \"kubernetes.io/projected/caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2-kube-api-access-8kgcg\") pod \"manila-operator-controller-manager-67d996989d-9gwwj\" (UID: \"caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181860 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h4qj\" (UniqueName: \"kubernetes.io/projected/38eef260-c32f-4568-9936-6197ba984f05-kube-api-access-5h4qj\") pod \"mariadb-operator-controller-manager-6994f66f48-95whc\" (UID: \"38eef260-c32f-4568-9936-6197ba984f05\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181894 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjxs\" (UniqueName: \"kubernetes.io/projected/d56efcbf-3414-4bd1-9cbf-d56c434ac529-kube-api-access-2fjxs\") pod \"ironic-operator-controller-manager-554564d7fc-5k85p\" (UID: \"d56efcbf-3414-4bd1-9cbf-d56c434ac529\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:30 crc 
kubenswrapper[4699]: I0226 11:28:30.181928 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181960 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5jwl\" (UniqueName: \"kubernetes.io/projected/afbeb2d8-c332-447b-a931-9fe7b246914d-kube-api-access-s5jwl\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.181994 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6zf\" (UniqueName: \"kubernetes.io/projected/a2c419ab-2a99-4d37-b46c-b84024f24b2e-kube-api-access-mb6zf\") pod \"keystone-operator-controller-manager-b4d948c87-d2pxc\" (UID: \"a2c419ab-2a99-4d37-b46c-b84024f24b2e\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.182530 4699 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.182589 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert podName:afbeb2d8-c332-447b-a931-9fe7b246914d nodeName:}" failed. No retries permitted until 2026-02-26 11:28:30.682567414 +0000 UTC m=+1056.493393848 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert") pod "infra-operator-controller-manager-79d975b745-mtrs6" (UID: "afbeb2d8-c332-447b-a931-9fe7b246914d") : secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.184355 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.185430 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.189348 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gltb9" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.205431 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.205725 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.224458 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.224500 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.247668 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.272887 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6d6ph" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.273300 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kllrb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.287243 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.293865 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.317830 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjxs\" (UniqueName: \"kubernetes.io/projected/d56efcbf-3414-4bd1-9cbf-d56c434ac529-kube-api-access-2fjxs\") pod \"ironic-operator-controller-manager-554564d7fc-5k85p\" (UID: \"d56efcbf-3414-4bd1-9cbf-d56c434ac529\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.335284 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5jwl\" (UniqueName: \"kubernetes.io/projected/afbeb2d8-c332-447b-a931-9fe7b246914d-kube-api-access-s5jwl\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.386491 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgcg\" 
(UniqueName: \"kubernetes.io/projected/caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2-kube-api-access-8kgcg\") pod \"manila-operator-controller-manager-67d996989d-9gwwj\" (UID: \"caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.386558 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4qj\" (UniqueName: \"kubernetes.io/projected/38eef260-c32f-4568-9936-6197ba984f05-kube-api-access-5h4qj\") pod \"mariadb-operator-controller-manager-6994f66f48-95whc\" (UID: \"38eef260-c32f-4568-9936-6197ba984f05\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.387018 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb6zf\" (UniqueName: \"kubernetes.io/projected/a2c419ab-2a99-4d37-b46c-b84024f24b2e-kube-api-access-mb6zf\") pod \"keystone-operator-controller-manager-b4d948c87-d2pxc\" (UID: \"a2c419ab-2a99-4d37-b46c-b84024f24b2e\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.403169 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.455631 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h4qj\" (UniqueName: \"kubernetes.io/projected/38eef260-c32f-4568-9936-6197ba984f05-kube-api-access-5h4qj\") pod \"mariadb-operator-controller-manager-6994f66f48-95whc\" (UID: \"38eef260-c32f-4568-9936-6197ba984f05\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.456341 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb6zf\" (UniqueName: \"kubernetes.io/projected/a2c419ab-2a99-4d37-b46c-b84024f24b2e-kube-api-access-mb6zf\") pod \"keystone-operator-controller-manager-b4d948c87-d2pxc\" (UID: \"a2c419ab-2a99-4d37-b46c-b84024f24b2e\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.480041 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgcg\" (UniqueName: \"kubernetes.io/projected/caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2-kube-api-access-8kgcg\") pod \"manila-operator-controller-manager-67d996989d-9gwwj\" (UID: \"caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.489276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vmk9\" (UniqueName: \"kubernetes.io/projected/0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee-kube-api-access-7vmk9\") pod \"nova-operator-controller-manager-567668f5cf-4mghs\" (UID: \"0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.489328 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wxf\" (UniqueName: \"kubernetes.io/projected/54959b79-361c-415a-986d-1af6d8eb6701-kube-api-access-g2wxf\") pod \"neutron-operator-controller-manager-6bd4687957-6gblm\" (UID: \"54959b79-361c-415a-986d-1af6d8eb6701\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.489372 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rqs\" (UniqueName: \"kubernetes.io/projected/a6e7ca85-e18b-4605-9180-316f65b82006-kube-api-access-97rqs\") pod \"octavia-operator-controller-manager-659dc6bbfc-2wj2n\" (UID: \"a6e7ca85-e18b-4605-9180-316f65b82006\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.514550 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.523362 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.525600 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.532154 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.532617 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-np2lz" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.570426 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.577940 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.592590 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vmk9\" (UniqueName: \"kubernetes.io/projected/0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee-kube-api-access-7vmk9\") pod \"nova-operator-controller-manager-567668f5cf-4mghs\" (UID: \"0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.592654 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wxf\" (UniqueName: \"kubernetes.io/projected/54959b79-361c-415a-986d-1af6d8eb6701-kube-api-access-g2wxf\") pod \"neutron-operator-controller-manager-6bd4687957-6gblm\" (UID: \"54959b79-361c-415a-986d-1af6d8eb6701\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.592707 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-97rqs\" (UniqueName: \"kubernetes.io/projected/a6e7ca85-e18b-4605-9180-316f65b82006-kube-api-access-97rqs\") pod \"octavia-operator-controller-manager-659dc6bbfc-2wj2n\" (UID: \"a6e7ca85-e18b-4605-9180-316f65b82006\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.593796 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-96png"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.596369 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.597928 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.601503 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vgngh" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.614067 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.620205 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.622340 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fflsw" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.644070 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-96png"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.646392 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vmk9\" (UniqueName: \"kubernetes.io/projected/0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee-kube-api-access-7vmk9\") pod \"nova-operator-controller-manager-567668f5cf-4mghs\" (UID: \"0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.646392 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wxf\" (UniqueName: \"kubernetes.io/projected/54959b79-361c-415a-986d-1af6d8eb6701-kube-api-access-g2wxf\") pod \"neutron-operator-controller-manager-6bd4687957-6gblm\" (UID: \"54959b79-361c-415a-986d-1af6d8eb6701\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.647372 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rqs\" (UniqueName: \"kubernetes.io/projected/a6e7ca85-e18b-4605-9180-316f65b82006-kube-api-access-97rqs\") pod \"octavia-operator-controller-manager-659dc6bbfc-2wj2n\" (UID: \"a6e7ca85-e18b-4605-9180-316f65b82006\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.651606 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.679939 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.694839 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qzxb\" (UniqueName: \"kubernetes.io/projected/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-kube-api-access-5qzxb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.694907 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.695000 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.695029 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpsrr\" (UniqueName: \"kubernetes.io/projected/a90c4025-7bd1-401b-8f92-5f15a58fb3d6-kube-api-access-tpsrr\") pod 
\"ovn-operator-controller-manager-5955d8c787-96png\" (UID: \"a90c4025-7bd1-401b-8f92-5f15a58fb3d6\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.695221 4699 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.695299 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert podName:afbeb2d8-c332-447b-a931-9fe7b246914d nodeName:}" failed. No retries permitted until 2026-02-26 11:28:31.695280028 +0000 UTC m=+1057.506106462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert") pod "infra-operator-controller-manager-79d975b745-mtrs6" (UID: "afbeb2d8-c332-447b-a931-9fe7b246914d") : secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.698297 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.699345 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.702014 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-j6gsg" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.719721 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.734214 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.748015 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.758964 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.760087 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.764701 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fmhd6" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.764896 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.776314 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.777236 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.781085 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nts9g" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.781247 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.788830 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.789177 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.789964 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.793507 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qzv2j" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.797785 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28f6x\" (UniqueName: \"kubernetes.io/projected/7545763d-d2d2-4b6e-980d-737062f0a894-kube-api-access-28f6x\") pod \"placement-operator-controller-manager-8497b45c89-jxr77\" (UID: \"7545763d-d2d2-4b6e-980d-737062f0a894\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.797875 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpsrr\" (UniqueName: \"kubernetes.io/projected/a90c4025-7bd1-401b-8f92-5f15a58fb3d6-kube-api-access-tpsrr\") pod \"ovn-operator-controller-manager-5955d8c787-96png\" (UID: \"a90c4025-7bd1-401b-8f92-5f15a58fb3d6\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.798014 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qzxb\" (UniqueName: \"kubernetes.io/projected/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-kube-api-access-5qzxb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.798075 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod 
\"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.798100 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5nk9\" (UniqueName: \"kubernetes.io/projected/33fc0a61-18c9-4e80-b898-92a5b1b71dac-kube-api-access-p5nk9\") pod \"swift-operator-controller-manager-68f46476f-bqvxr\" (UID: \"33fc0a61-18c9-4e80-b898-92a5b1b71dac\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.800871 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: E0226 11:28:30.800921 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert podName:ce7c40ca-05ad-49ca-a091-02ac588c3eb7 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:31.300906684 +0000 UTC m=+1057.111733118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" (UID: "ce7c40ca-05ad-49ca-a091-02ac588c3eb7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.801032 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.832214 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpsrr\" (UniqueName: \"kubernetes.io/projected/a90c4025-7bd1-401b-8f92-5f15a58fb3d6-kube-api-access-tpsrr\") pod \"ovn-operator-controller-manager-5955d8c787-96png\" (UID: \"a90c4025-7bd1-401b-8f92-5f15a58fb3d6\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.841335 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.845988 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qzxb\" (UniqueName: \"kubernetes.io/projected/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-kube-api-access-5qzxb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.859827 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.859922 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.861844 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.862273 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.862731 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x6dfs" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.868230 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.869083 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.870229 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.871886 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-p5ttl" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.899504 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkj4s\" (UniqueName: \"kubernetes.io/projected/a2b3bf3b-a815-4033-983b-eedc16b8609f-kube-api-access-lkj4s\") pod \"watcher-operator-controller-manager-bccc79885-fnnc7\" (UID: \"a2b3bf3b-a815-4033-983b-eedc16b8609f\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.899568 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdkk\" (UniqueName: \"kubernetes.io/projected/5be0c14a-e51f-4b69-ab58-c0cac66910e2-kube-api-access-6hdkk\") pod \"test-operator-controller-manager-5dc6794d5b-mwvnr\" (UID: \"5be0c14a-e51f-4b69-ab58-c0cac66910e2\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.899592 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsfpb\" (UniqueName: \"kubernetes.io/projected/15255a9b-0767-4518-8e81-ca9044f9190a-kube-api-access-wsfpb\") pod \"telemetry-operator-controller-manager-589c568786-f9kz5\" (UID: \"15255a9b-0767-4518-8e81-ca9044f9190a\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.899640 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5nk9\" (UniqueName: 
\"kubernetes.io/projected/33fc0a61-18c9-4e80-b898-92a5b1b71dac-kube-api-access-p5nk9\") pod \"swift-operator-controller-manager-68f46476f-bqvxr\" (UID: \"33fc0a61-18c9-4e80-b898-92a5b1b71dac\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.899680 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28f6x\" (UniqueName: \"kubernetes.io/projected/7545763d-d2d2-4b6e-980d-737062f0a894-kube-api-access-28f6x\") pod \"placement-operator-controller-manager-8497b45c89-jxr77\" (UID: \"7545763d-d2d2-4b6e-980d-737062f0a894\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.908493 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4"] Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.933259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28f6x\" (UniqueName: \"kubernetes.io/projected/7545763d-d2d2-4b6e-980d-737062f0a894-kube-api-access-28f6x\") pod \"placement-operator-controller-manager-8497b45c89-jxr77\" (UID: \"7545763d-d2d2-4b6e-980d-737062f0a894\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" Feb 26 11:28:30 crc kubenswrapper[4699]: I0226 11:28:30.933785 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5nk9\" (UniqueName: \"kubernetes.io/projected/33fc0a61-18c9-4e80-b898-92a5b1b71dac-kube-api-access-p5nk9\") pod \"swift-operator-controller-manager-68f46476f-bqvxr\" (UID: \"33fc0a61-18c9-4e80-b898-92a5b1b71dac\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002019 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z4r4l\" (UniqueName: \"kubernetes.io/projected/ebf1a568-be30-4ceb-bc67-e3158a0280b9-kube-api-access-z4r4l\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002418 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002453 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkj4s\" (UniqueName: \"kubernetes.io/projected/a2b3bf3b-a815-4033-983b-eedc16b8609f-kube-api-access-lkj4s\") pod \"watcher-operator-controller-manager-bccc79885-fnnc7\" (UID: \"a2b3bf3b-a815-4033-983b-eedc16b8609f\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002482 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2sjn\" (UniqueName: \"kubernetes.io/projected/8d440653-f1c3-483c-a37d-463dcfc15224-kube-api-access-d2sjn\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-ghqf4\" (UID: \"8d440653-f1c3-483c-a37d-463dcfc15224\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002515 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hdkk\" (UniqueName: \"kubernetes.io/projected/5be0c14a-e51f-4b69-ab58-c0cac66910e2-kube-api-access-6hdkk\") pod \"test-operator-controller-manager-5dc6794d5b-mwvnr\" (UID: \"5be0c14a-e51f-4b69-ab58-c0cac66910e2\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.002542 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsfpb\" (UniqueName: \"kubernetes.io/projected/15255a9b-0767-4518-8e81-ca9044f9190a-kube-api-access-wsfpb\") pod \"telemetry-operator-controller-manager-589c568786-f9kz5\" (UID: \"15255a9b-0767-4518-8e81-ca9044f9190a\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.046249 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkj4s\" (UniqueName: \"kubernetes.io/projected/a2b3bf3b-a815-4033-983b-eedc16b8609f-kube-api-access-lkj4s\") pod \"watcher-operator-controller-manager-bccc79885-fnnc7\" (UID: \"a2b3bf3b-a815-4033-983b-eedc16b8609f\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.059179 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdkk\" (UniqueName: \"kubernetes.io/projected/5be0c14a-e51f-4b69-ab58-c0cac66910e2-kube-api-access-6hdkk\") pod \"test-operator-controller-manager-5dc6794d5b-mwvnr\" (UID: \"5be0c14a-e51f-4b69-ab58-c0cac66910e2\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" 
Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.063686 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsfpb\" (UniqueName: \"kubernetes.io/projected/15255a9b-0767-4518-8e81-ca9044f9190a-kube-api-access-wsfpb\") pod \"telemetry-operator-controller-manager-589c568786-f9kz5\" (UID: \"15255a9b-0767-4518-8e81-ca9044f9190a\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.103878 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.103965 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.104001 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2sjn\" (UniqueName: \"kubernetes.io/projected/8d440653-f1c3-483c-a37d-463dcfc15224-kube-api-access-d2sjn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ghqf4\" (UID: \"8d440653-f1c3-483c-a37d-463dcfc15224\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.104106 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4r4l\" (UniqueName: 
\"kubernetes.io/projected/ebf1a568-be30-4ceb-bc67-e3158a0280b9-kube-api-access-z4r4l\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.104171 4699 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.104250 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:31.60423057 +0000 UTC m=+1057.415057014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "metrics-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.104188 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.104510 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:31.604490277 +0000 UTC m=+1057.415316711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.132415 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2sjn\" (UniqueName: \"kubernetes.io/projected/8d440653-f1c3-483c-a37d-463dcfc15224-kube-api-access-d2sjn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ghqf4\" (UID: \"8d440653-f1c3-483c-a37d-463dcfc15224\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.144604 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4r4l\" (UniqueName: \"kubernetes.io/projected/ebf1a568-be30-4ceb-bc67-e3158a0280b9-kube-api-access-z4r4l\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.199015 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.222184 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.242885 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.272457 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.310808 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.311584 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.311648 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert podName:ce7c40ca-05ad-49ca-a091-02ac588c3eb7 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:32.311628563 +0000 UTC m=+1058.122454997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" (UID: "ce7c40ca-05ad-49ca-a091-02ac588c3eb7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.373213 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.421823 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.617942 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.618036 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.618171 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.618231 4699 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.618258 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:32.61823233 +0000 UTC m=+1058.429058764 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.618302 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:32.618285071 +0000 UTC m=+1058.429111585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "metrics-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.630292 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f"] Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.635407 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9"] Feb 26 11:28:31 crc kubenswrapper[4699]: W0226 11:28:31.636964 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b204025_d5ff_4c74_96b9_6774b62e0cc4.slice/crio-8c4fb8e66f8f1dfface81fc9e61cadc1e57ce5df81ab31b96de495e237aaa5a4 WatchSource:0}: Error finding container 8c4fb8e66f8f1dfface81fc9e61cadc1e57ce5df81ab31b96de495e237aaa5a4: Status 404 returned error can't find the container with id 8c4fb8e66f8f1dfface81fc9e61cadc1e57ce5df81ab31b96de495e237aaa5a4 Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.722268 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.722485 4699 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: E0226 11:28:31.722558 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert podName:afbeb2d8-c332-447b-a931-9fe7b246914d nodeName:}" failed. No retries permitted until 2026-02-26 11:28:33.722523258 +0000 UTC m=+1059.533349692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert") pod "infra-operator-controller-manager-79d975b745-mtrs6" (UID: "afbeb2d8-c332-447b-a931-9fe7b246914d") : secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.773612 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p"] Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.806042 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz"] Feb 26 11:28:31 crc kubenswrapper[4699]: I0226 11:28:31.960406 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.025173 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm"] Feb 26 11:28:32 
crc kubenswrapper[4699]: W0226 11:28:32.036393 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c2552c_8182_4cfe_a397_39ad287029e5.slice/crio-2a5aaba4b6ff0ac20e706fa4cb48c18ea61316825ddc6a5ac1e78e37ff5a21eb WatchSource:0}: Error finding container 2a5aaba4b6ff0ac20e706fa4cb48c18ea61316825ddc6a5ac1e78e37ff5a21eb: Status 404 returned error can't find the container with id 2a5aaba4b6ff0ac20e706fa4cb48c18ea61316825ddc6a5ac1e78e37ff5a21eb Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.083878 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.117919 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-96png"] Feb 26 11:28:32 crc kubenswrapper[4699]: W0226 11:28:32.126036 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90c4025_7bd1_401b_8f92_5f15a58fb3d6.slice/crio-d6aa0057c32f13b3e8adb701b0ee1ce87290118b32e08f2e34242c3151275b37 WatchSource:0}: Error finding container d6aa0057c32f13b3e8adb701b0ee1ce87290118b32e08f2e34242c3151275b37: Status 404 returned error can't find the container with id d6aa0057c32f13b3e8adb701b0ee1ce87290118b32e08f2e34242c3151275b37 Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.226554 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.238477 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd"] Feb 26 11:28:32 crc kubenswrapper[4699]: W0226 11:28:32.253056 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e7ca85_e18b_4605_9180_316f65b82006.slice/crio-aa96c261f60037df1a40fde6b38bfd921f422a546b721874f9515049a4d51ea0 WatchSource:0}: Error finding container aa96c261f60037df1a40fde6b38bfd921f422a546b721874f9515049a4d51ea0: Status 404 returned error can't find the container with id aa96c261f60037df1a40fde6b38bfd921f422a546b721874f9515049a4d51ea0 Feb 26 11:28:32 crc kubenswrapper[4699]: W0226 11:28:32.259282 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619dff06_7255_4aab_9ffe_9f2561bcc904.slice/crio-db102e99dacd7e0f310a45a6a04234e31237ab1cbcd6c93742ded6628d3cb9d1 WatchSource:0}: Error finding container db102e99dacd7e0f310a45a6a04234e31237ab1cbcd6c93742ded6628d3cb9d1: Status 404 returned error can't find the container with id db102e99dacd7e0f310a45a6a04234e31237ab1cbcd6c93742ded6628d3cb9d1 Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.325788 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.325835 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.325846 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.325855 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.325865 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 
11:28:32.325875 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc"] Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.327534 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr"] Feb 26 11:28:32 crc kubenswrapper[4699]: W0226 11:28:32.328347 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7545763d_d2d2_4b6e_980d_737062f0a894.slice/crio-a45f38838029fc27dc6caa1da4ec104a117a6feb671d57bac438346127d9e56f WatchSource:0}: Error finding container a45f38838029fc27dc6caa1da4ec104a117a6feb671d57bac438346127d9e56f: Status 404 returned error can't find the container with id a45f38838029fc27dc6caa1da4ec104a117a6feb671d57bac438346127d9e56f Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.329501 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.330361 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.330417 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert podName:ce7c40ca-05ad-49ca-a091-02ac588c3eb7 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:34.330398403 +0000 UTC m=+1060.141224837 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" (UID: "ce7c40ca-05ad-49ca-a091-02ac588c3eb7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.333666 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm"] Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.344535 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8kgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-9gwwj_openstack-operators(caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.345676 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" podUID="caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2" Feb 26 11:28:32 crc kubenswrapper[4699]: W0226 11:28:32.347865 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38eef260_c32f_4568_9936_6197ba984f05.slice/crio-03434c2603ce983f1589e4c04fcd928073db2a119d9fab5c4102bad464c1649a WatchSource:0}: Error finding container 
03434c2603ce983f1589e4c04fcd928073db2a119d9fab5c4102bad464c1649a: Status 404 returned error can't find the container with id 03434c2603ce983f1589e4c04fcd928073db2a119d9fab5c4102bad464c1649a Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.356129 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5h4qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-95whc_openstack-operators(38eef260-c32f-4568-9936-6197ba984f05): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.357508 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p5nk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-bqvxr_openstack-operators(33fc0a61-18c9-4e80-b898-92a5b1b71dac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.357632 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" podUID="38eef260-c32f-4568-9936-6197ba984f05" Feb 26 11:28:32 crc 
kubenswrapper[4699]: W0226 11:28:32.358550 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be0c14a_e51f_4b69_ab58_c0cac66910e2.slice/crio-c2fdf1600e84d512a1ae50393921855a8028a64733ca2a0a334b34e26257353b WatchSource:0}: Error finding container c2fdf1600e84d512a1ae50393921855a8028a64733ca2a0a334b34e26257353b: Status 404 returned error can't find the container with id c2fdf1600e84d512a1ae50393921855a8028a64733ca2a0a334b34e26257353b Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.358630 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" podUID="33fc0a61-18c9-4e80-b898-92a5b1b71dac" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.364342 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2wxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-6gblm_openstack-operators(54959b79-361c-415a-986d-1af6d8eb6701): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.364408 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hdkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-mwvnr_openstack-operators(5be0c14a-e51f-4b69-ab58-c0cac66910e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.366178 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" podUID="5be0c14a-e51f-4b69-ab58-c0cac66910e2" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.366264 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" podUID="54959b79-361c-415a-986d-1af6d8eb6701" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.484936 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" event={"ID":"38eef260-c32f-4568-9936-6197ba984f05","Type":"ContainerStarted","Data":"03434c2603ce983f1589e4c04fcd928073db2a119d9fab5c4102bad464c1649a"} Feb 26 
11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.486252 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" podUID="38eef260-c32f-4568-9936-6197ba984f05" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.486718 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" event={"ID":"7b204025-d5ff-4c74-96b9-6774b62e0cc4","Type":"ContainerStarted","Data":"8c4fb8e66f8f1dfface81fc9e61cadc1e57ce5df81ab31b96de495e237aaa5a4"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.490096 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" event={"ID":"a2c419ab-2a99-4d37-b46c-b84024f24b2e","Type":"ContainerStarted","Data":"6e84e8fdaecccbb4363b0e0667cab937ffa9a4a9f454839b6678898eef797324"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.492367 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" event={"ID":"1814471e-5f82-4464-9528-75da66d7235b","Type":"ContainerStarted","Data":"b5b101ddedfa4b19b97f65b260a1ddb656db205a395b8f30f4de98903c761fb8"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.494206 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" event={"ID":"a6e7ca85-e18b-4605-9180-316f65b82006","Type":"ContainerStarted","Data":"aa96c261f60037df1a40fde6b38bfd921f422a546b721874f9515049a4d51ea0"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.495830 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" event={"ID":"5be0c14a-e51f-4b69-ab58-c0cac66910e2","Type":"ContainerStarted","Data":"c2fdf1600e84d512a1ae50393921855a8028a64733ca2a0a334b34e26257353b"} Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.497948 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" podUID="5be0c14a-e51f-4b69-ab58-c0cac66910e2" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.512260 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" event={"ID":"15255a9b-0767-4518-8e81-ca9044f9190a","Type":"ContainerStarted","Data":"7f66db4503c9cf87d41f0fc7e2f3a655fefe3e62ed6273462062d4ddfdebe629"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.515524 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" event={"ID":"0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee","Type":"ContainerStarted","Data":"0aa5533ef85057af20475853b15327d2f6b844961cfc0e9672c1d5cf095b950f"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.516846 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" event={"ID":"caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2","Type":"ContainerStarted","Data":"a6d65cc2c247074d7226b3de196e27eba5b0573bad2712ac68409f726d4f7e9c"} Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.522878 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" podUID="caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.549194 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" event={"ID":"a90c4025-7bd1-401b-8f92-5f15a58fb3d6","Type":"ContainerStarted","Data":"d6aa0057c32f13b3e8adb701b0ee1ce87290118b32e08f2e34242c3151275b37"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.568381 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" event={"ID":"d56efcbf-3414-4bd1-9cbf-d56c434ac529","Type":"ContainerStarted","Data":"6b0e33aeb1ddc4a7b24c34ab91d6bbd692c23ca70a55db1e39083b786c3cb891"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.573529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" event={"ID":"33fc0a61-18c9-4e80-b898-92a5b1b71dac","Type":"ContainerStarted","Data":"f70bbd10421047762b4e3679725eeb5ed1d110a8262b0e1a765ea1618b2299a1"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.576068 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4"] Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.579536 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" podUID="33fc0a61-18c9-4e80-b898-92a5b1b71dac" Feb 26 11:28:32 crc kubenswrapper[4699]: 
I0226 11:28:32.579856 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" event={"ID":"07c2552c-8182-4cfe-a397-39ad287029e5","Type":"ContainerStarted","Data":"2a5aaba4b6ff0ac20e706fa4cb48c18ea61316825ddc6a5ac1e78e37ff5a21eb"} Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.580579 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2sjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ghqf4_openstack-operators(8d440653-f1c3-483c-a37d-463dcfc15224): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.581326 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" event={"ID":"27e251bb-8f9b-48d4-9ea3-81d03fd85244","Type":"ContainerStarted","Data":"61ae194977614152b39fbf549b14c2b6ba4e9a1ad2475153a43fb5b2aa76152b"} Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.581766 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" podUID="8d440653-f1c3-483c-a37d-463dcfc15224" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.585439 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" 
event={"ID":"619dff06-7255-4aab-9ffe-9f2561bcc904","Type":"ContainerStarted","Data":"db102e99dacd7e0f310a45a6a04234e31237ab1cbcd6c93742ded6628d3cb9d1"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.587137 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" event={"ID":"7545763d-d2d2-4b6e-980d-737062f0a894","Type":"ContainerStarted","Data":"a45f38838029fc27dc6caa1da4ec104a117a6feb671d57bac438346127d9e56f"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.590493 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" event={"ID":"35555f68-d5c4-44b2-9dfa-af5f91f57c7c","Type":"ContainerStarted","Data":"2e24a98d5e837ec1f9775546fc7401ede85f925da653cc504122bd2164829905"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.591453 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" event={"ID":"54959b79-361c-415a-986d-1af6d8eb6701","Type":"ContainerStarted","Data":"19f836c78094e1fa67ae7ef4cdc0cf8b6da9e5fc2e11e645a38476256764b5d3"} Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.591819 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7"] Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.594284 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" podUID="54959b79-361c-415a-986d-1af6d8eb6701" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.639852 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:32 crc kubenswrapper[4699]: I0226 11:28:32.639939 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.640065 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.640158 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:34.640138999 +0000 UTC m=+1060.450965433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "webhook-server-cert" not found Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.641188 4699 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 11:28:32 crc kubenswrapper[4699]: E0226 11:28:32.641789 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:34.641760695 +0000 UTC m=+1060.452587149 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "metrics-server-cert" not found Feb 26 11:28:33 crc kubenswrapper[4699]: I0226 11:28:33.621141 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" event={"ID":"8d440653-f1c3-483c-a37d-463dcfc15224","Type":"ContainerStarted","Data":"faa6deedd8d002c98bac7dc2db2f44b197dd5d6fac224340edf72a5d88594500"} Feb 26 11:28:33 crc kubenswrapper[4699]: I0226 11:28:33.625004 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" event={"ID":"a2b3bf3b-a815-4033-983b-eedc16b8609f","Type":"ContainerStarted","Data":"de1a251e348c24559c54fd374b8e5b8730720185642bbad9e6bd93882f1d1e59"} Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.625646 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" podUID="8d440653-f1c3-483c-a37d-463dcfc15224" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.636278 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" podUID="5be0c14a-e51f-4b69-ab58-c0cac66910e2" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.636716 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" podUID="38eef260-c32f-4568-9936-6197ba984f05" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.638662 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" podUID="33fc0a61-18c9-4e80-b898-92a5b1b71dac" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.639977 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" podUID="54959b79-361c-415a-986d-1af6d8eb6701" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.640764 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" podUID="caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2" Feb 26 11:28:33 crc kubenswrapper[4699]: I0226 11:28:33.763191 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.763504 4699 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:33 crc kubenswrapper[4699]: E0226 11:28:33.763563 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert podName:afbeb2d8-c332-447b-a931-9fe7b246914d nodeName:}" failed. No retries permitted until 2026-02-26 11:28:37.763543498 +0000 UTC m=+1063.574369932 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert") pod "infra-operator-controller-manager-79d975b745-mtrs6" (UID: "afbeb2d8-c332-447b-a931-9fe7b246914d") : secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: I0226 11:28:34.373791 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.374853 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.375191 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert podName:ce7c40ca-05ad-49ca-a091-02ac588c3eb7 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:38.375170288 +0000 UTC m=+1064.185996782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" (UID: "ce7c40ca-05ad-49ca-a091-02ac588c3eb7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.635610 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" podUID="8d440653-f1c3-483c-a37d-463dcfc15224" Feb 26 11:28:34 crc kubenswrapper[4699]: I0226 11:28:34.679068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:34 crc kubenswrapper[4699]: I0226 11:28:34.679157 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.679357 4699 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.679447 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:38.67943186 +0000 UTC m=+1064.490258294 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "metrics-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.679462 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 11:28:34 crc kubenswrapper[4699]: E0226 11:28:34.679540 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:38.679519162 +0000 UTC m=+1064.490345626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "webhook-server-cert" not found Feb 26 11:28:37 crc kubenswrapper[4699]: I0226 11:28:37.842485 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:37 crc kubenswrapper[4699]: E0226 11:28:37.843023 4699 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:37 crc kubenswrapper[4699]: E0226 11:28:37.843084 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert podName:afbeb2d8-c332-447b-a931-9fe7b246914d nodeName:}" failed. No retries permitted until 2026-02-26 11:28:45.843063225 +0000 UTC m=+1071.653889659 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert") pod "infra-operator-controller-manager-79d975b745-mtrs6" (UID: "afbeb2d8-c332-447b-a931-9fe7b246914d") : secret "infra-operator-webhook-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: I0226 11:28:38.451667 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.451877 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.451958 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert podName:ce7c40ca-05ad-49ca-a091-02ac588c3eb7 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:46.451935908 +0000 UTC m=+1072.262762342 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" (UID: "ce7c40ca-05ad-49ca-a091-02ac588c3eb7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: I0226 11:28:38.756971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:38 crc kubenswrapper[4699]: I0226 11:28:38.757043 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.757210 4699 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.757245 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.757258 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:46.757244368 +0000 UTC m=+1072.568070792 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "metrics-server-cert" not found Feb 26 11:28:38 crc kubenswrapper[4699]: E0226 11:28:38.757342 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs podName:ebf1a568-be30-4ceb-bc67-e3158a0280b9 nodeName:}" failed. No retries permitted until 2026-02-26 11:28:46.75732391 +0000 UTC m=+1072.568150344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs") pod "openstack-operator-controller-manager-947f4f86b-m69sv" (UID: "ebf1a568-be30-4ceb-bc67-e3158a0280b9") : secret "webhook-server-cert" not found Feb 26 11:28:45 crc kubenswrapper[4699]: I0226 11:28:45.890211 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:45 crc kubenswrapper[4699]: I0226 11:28:45.901895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbeb2d8-c332-447b-a931-9fe7b246914d-cert\") pod \"infra-operator-controller-manager-79d975b745-mtrs6\" (UID: \"afbeb2d8-c332-447b-a931-9fe7b246914d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:45 crc kubenswrapper[4699]: I0226 11:28:45.917761 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.499798 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.509382 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce7c40ca-05ad-49ca-a091-02ac588c3eb7-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb\" (UID: \"ce7c40ca-05ad-49ca-a091-02ac588c3eb7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.628921 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.803388 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.803536 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.811862 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-metrics-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.811972 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebf1a568-be30-4ceb-bc67-e3158a0280b9-webhook-certs\") pod \"openstack-operator-controller-manager-947f4f86b-m69sv\" (UID: \"ebf1a568-be30-4ceb-bc67-e3158a0280b9\") " pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:46 crc kubenswrapper[4699]: I0226 11:28:46.986187 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:47 crc kubenswrapper[4699]: E0226 11:28:47.838426 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 26 11:28:47 crc kubenswrapper[4699]: E0226 11:28:47.838662 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrm4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-qf9vd_openstack-operators(619dff06-7255-4aab-9ffe-9f2561bcc904): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:47 crc kubenswrapper[4699]: E0226 11:28:47.841143 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" podUID="619dff06-7255-4aab-9ffe-9f2561bcc904" Feb 26 11:28:48 crc kubenswrapper[4699]: E0226 11:28:48.516586 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06" Feb 26 11:28:48 crc kubenswrapper[4699]: E0226 11:28:48.517086 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97rqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-2wj2n_openstack-operators(a6e7ca85-e18b-4605-9180-316f65b82006): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:48 crc kubenswrapper[4699]: E0226 11:28:48.518285 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" podUID="a6e7ca85-e18b-4605-9180-316f65b82006" Feb 26 11:28:48 crc kubenswrapper[4699]: E0226 11:28:48.721803 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" podUID="619dff06-7255-4aab-9ffe-9f2561bcc904" Feb 26 11:28:48 crc kubenswrapper[4699]: E0226 11:28:48.722054 4699 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" podUID="a6e7ca85-e18b-4605-9180-316f65b82006" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.094179 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.095292 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lkj4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-fnnc7_openstack-operators(a2b3bf3b-a815-4033-983b-eedc16b8609f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.097161 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" podUID="a2b3bf3b-a815-4033-983b-eedc16b8609f" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.728220 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" podUID="a2b3bf3b-a815-4033-983b-eedc16b8609f" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.965004 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.965562 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7vmk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-4mghs_openstack-operators(0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:49 crc kubenswrapper[4699]: E0226 11:28:49.967101 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" podUID="0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee" Feb 26 11:28:50 crc kubenswrapper[4699]: E0226 11:28:50.483931 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 26 11:28:50 crc kubenswrapper[4699]: E0226 11:28:50.484599 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28f6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-jxr77_openstack-operators(7545763d-d2d2-4b6e-980d-737062f0a894): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:50 crc kubenswrapper[4699]: E0226 11:28:50.488301 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" podUID="7545763d-d2d2-4b6e-980d-737062f0a894" Feb 26 11:28:50 crc kubenswrapper[4699]: E0226 11:28:50.735561 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" podUID="0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee" Feb 26 11:28:50 crc kubenswrapper[4699]: E0226 11:28:50.735576 4699 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" podUID="7545763d-d2d2-4b6e-980d-737062f0a894" Feb 26 11:28:50 crc kubenswrapper[4699]: I0226 11:28:50.930160 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv"] Feb 26 11:28:51 crc kubenswrapper[4699]: E0226 11:28:51.266815 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 26 11:28:51 crc kubenswrapper[4699]: E0226 11:28:51.267016 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mb6zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-d2pxc_openstack-operators(a2c419ab-2a99-4d37-b46c-b84024f24b2e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:28:51 crc kubenswrapper[4699]: E0226 11:28:51.270215 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" podUID="a2c419ab-2a99-4d37-b46c-b84024f24b2e" Feb 26 11:28:51 crc kubenswrapper[4699]: W0226 11:28:51.271729 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf1a568_be30_4ceb_bc67_e3158a0280b9.slice/crio-ea200e71db244b9336eb7a08d6d473be11b178679db63a369d1ad3c770199d02 WatchSource:0}: Error finding container ea200e71db244b9336eb7a08d6d473be11b178679db63a369d1ad3c770199d02: Status 404 returned error can't find the container with id ea200e71db244b9336eb7a08d6d473be11b178679db63a369d1ad3c770199d02 Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.704565 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb"] Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.753296 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" event={"ID":"ebf1a568-be30-4ceb-bc67-e3158a0280b9","Type":"ContainerStarted","Data":"ea200e71db244b9336eb7a08d6d473be11b178679db63a369d1ad3c770199d02"} Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.764375 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" event={"ID":"07c2552c-8182-4cfe-a397-39ad287029e5","Type":"ContainerStarted","Data":"6c11046fbd5bea3301881e9a1c591e718bc6ddf795eb2e2f617c6ab877a9b08f"} Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.766610 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.775731 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" 
event={"ID":"ce7c40ca-05ad-49ca-a091-02ac588c3eb7","Type":"ContainerStarted","Data":"5bec5464a696581ba804d2a53c257cae642531445e768a7a0f2319d83e69a268"} Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.781523 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" event={"ID":"1814471e-5f82-4464-9528-75da66d7235b","Type":"ContainerStarted","Data":"fa74f8b865e2119b7b535fbc125f5f43932d2aeac97440dfebd4b5039419ec0f"} Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.781603 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.787605 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:28:51 crc kubenswrapper[4699]: E0226 11:28:51.791398 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" podUID="a2c419ab-2a99-4d37-b46c-b84024f24b2e" Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.813797 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6"] Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.817417 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" podStartSLOduration=4.389839014 podStartE2EDuration="22.817393715s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.04729574 +0000 UTC 
m=+1057.858122174" lastFinishedPulling="2026-02-26 11:28:50.474850441 +0000 UTC m=+1076.285676875" observedRunningTime="2026-02-26 11:28:51.805873659 +0000 UTC m=+1077.616700083" watchObservedRunningTime="2026-02-26 11:28:51.817393715 +0000 UTC m=+1077.628220149" Feb 26 11:28:51 crc kubenswrapper[4699]: W0226 11:28:51.824307 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbeb2d8_c332_447b_a931_9fe7b246914d.slice/crio-57d928a069c0f4a62bd37bdb55b7db4621d8b61ab8483ee5844ea927364f139f WatchSource:0}: Error finding container 57d928a069c0f4a62bd37bdb55b7db4621d8b61ab8483ee5844ea927364f139f: Status 404 returned error can't find the container with id 57d928a069c0f4a62bd37bdb55b7db4621d8b61ab8483ee5844ea927364f139f Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.889653 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" podStartSLOduration=3.325654838 podStartE2EDuration="22.889635527s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:31.680191251 +0000 UTC m=+1057.491017685" lastFinishedPulling="2026-02-26 11:28:51.24417194 +0000 UTC m=+1077.054998374" observedRunningTime="2026-02-26 11:28:51.881533348 +0000 UTC m=+1077.692359782" watchObservedRunningTime="2026-02-26 11:28:51.889635527 +0000 UTC m=+1077.700461961" Feb 26 11:28:51 crc kubenswrapper[4699]: I0226 11:28:51.916366 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" podStartSLOduration=4.272735453 podStartE2EDuration="22.916348852s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:31.830276545 +0000 UTC m=+1057.641102979" lastFinishedPulling="2026-02-26 11:28:50.473889944 +0000 UTC m=+1076.284716378" observedRunningTime="2026-02-26 
11:28:51.913753179 +0000 UTC m=+1077.724579633" watchObservedRunningTime="2026-02-26 11:28:51.916348852 +0000 UTC m=+1077.727175286" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.797984 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" event={"ID":"a90c4025-7bd1-401b-8f92-5f15a58fb3d6","Type":"ContainerStarted","Data":"66e80b6ecdd81f6a591e87b879141a96e4ef50753f60557357b793024413feb1"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.798403 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.799592 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" event={"ID":"d56efcbf-3414-4bd1-9cbf-d56c434ac529","Type":"ContainerStarted","Data":"272f949db65f166b94d5c631bee16c7e1f418a1af6aaa2732c89b04a26218d51"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.799788 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.802013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" event={"ID":"ebf1a568-be30-4ceb-bc67-e3158a0280b9","Type":"ContainerStarted","Data":"94eac119d488dc7fd77ada5781bc981fa79d0eccb15b772532326268990baf17"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.802181 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.804336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" event={"ID":"35555f68-d5c4-44b2-9dfa-af5f91f57c7c","Type":"ContainerStarted","Data":"29b9415426dd1cf00735e4ad95da2d37939b5adddbfff72c69ca7d2285781200"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.804569 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.806721 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" event={"ID":"27e251bb-8f9b-48d4-9ea3-81d03fd85244","Type":"ContainerStarted","Data":"fa8d386511243b5c1f8ec81d6765dfe78e6bf9dceecc149bdaa7b5032edf7d43"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.808265 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" event={"ID":"15255a9b-0767-4518-8e81-ca9044f9190a","Type":"ContainerStarted","Data":"bbc67fb061f1462c637b2cb3a2f13c7b36d78c90e5193e1aa51941ff4adb6697"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.808386 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.811691 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" event={"ID":"afbeb2d8-c332-447b-a931-9fe7b246914d","Type":"ContainerStarted","Data":"57d928a069c0f4a62bd37bdb55b7db4621d8b61ab8483ee5844ea927364f139f"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.823940 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" podStartSLOduration=4.478343606 podStartE2EDuration="22.823900689s" 
podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.128594818 +0000 UTC m=+1057.939421252" lastFinishedPulling="2026-02-26 11:28:50.474151901 +0000 UTC m=+1076.284978335" observedRunningTime="2026-02-26 11:28:52.813396922 +0000 UTC m=+1078.624223366" watchObservedRunningTime="2026-02-26 11:28:52.823900689 +0000 UTC m=+1078.634727123" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.833548 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" event={"ID":"7b204025-d5ff-4c74-96b9-6774b62e0cc4","Type":"ContainerStarted","Data":"7636ab711872fcf872fecf3291e8b2b61bdc994423a76d8dcc516609dbe02d72"} Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.834457 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.868294 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" podStartSLOduration=22.868271413 podStartE2EDuration="22.868271413s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:28:52.862639954 +0000 UTC m=+1078.673466388" watchObservedRunningTime="2026-02-26 11:28:52.868271413 +0000 UTC m=+1078.679097847" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.895306 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" podStartSLOduration=3.991846613 podStartE2EDuration="22.895289447s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.34053249 +0000 UTC m=+1058.151358924" lastFinishedPulling="2026-02-26 
11:28:51.243975324 +0000 UTC m=+1077.054801758" observedRunningTime="2026-02-26 11:28:52.891784428 +0000 UTC m=+1078.702610872" watchObservedRunningTime="2026-02-26 11:28:52.895289447 +0000 UTC m=+1078.706115871" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.908568 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" podStartSLOduration=5.420544312 podStartE2EDuration="23.908550252s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:31.986023498 +0000 UTC m=+1057.796849942" lastFinishedPulling="2026-02-26 11:28:50.474029448 +0000 UTC m=+1076.284855882" observedRunningTime="2026-02-26 11:28:52.905588048 +0000 UTC m=+1078.716414482" watchObservedRunningTime="2026-02-26 11:28:52.908550252 +0000 UTC m=+1078.719376686" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.940202 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" podStartSLOduration=5.249730252 podStartE2EDuration="23.940188366s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:31.783602925 +0000 UTC m=+1057.594429359" lastFinishedPulling="2026-02-26 11:28:50.474061039 +0000 UTC m=+1076.284887473" observedRunningTime="2026-02-26 11:28:52.939313371 +0000 UTC m=+1078.750139815" watchObservedRunningTime="2026-02-26 11:28:52.940188366 +0000 UTC m=+1078.751014800" Feb 26 11:28:52 crc kubenswrapper[4699]: I0226 11:28:52.971851 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" podStartSLOduration=5.149085247 podStartE2EDuration="23.971831681s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:31.651255703 +0000 UTC m=+1057.462082137" lastFinishedPulling="2026-02-26 11:28:50.474002137 
+0000 UTC m=+1076.284828571" observedRunningTime="2026-02-26 11:28:52.965616445 +0000 UTC m=+1078.776442889" watchObservedRunningTime="2026-02-26 11:28:52.971831681 +0000 UTC m=+1078.782658125" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.652238 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"] Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.654731 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.685634 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"] Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.755990 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.756060 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7dzn\" (UniqueName: \"kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.756187 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc 
kubenswrapper[4699]: I0226 11:28:56.857229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.857297 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7dzn\" (UniqueName: \"kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.857330 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.857856 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.859868 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.879912 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7dzn\" (UniqueName: \"kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn\") pod \"redhat-marketplace-lqqfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") " pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.976566 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqqfb" Feb 26 11:28:56 crc kubenswrapper[4699]: I0226 11:28:56.992483 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-947f4f86b-m69sv" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.116107 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-sndb9" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.130708 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-xw85z" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.147180 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4k4sm" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.168377 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-jh7vz" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.210703 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-t8c9f" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.407033 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5k85p" Feb 26 11:29:00 crc kubenswrapper[4699]: I0226 11:29:00.873019 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-96png" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.276471 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f9kz5" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.842222 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"] Feb 26 11:29:01 crc kubenswrapper[4699]: W0226 11:29:01.850983 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcab2afa_9fb1_4d74_9a95_c2fe6a00bbfb.slice/crio-b20b95eb36039045cc49af319621d539e3867b2ec5748b29d5419b0ee942114d WatchSource:0}: Error finding container b20b95eb36039045cc49af319621d539e3867b2ec5748b29d5419b0ee942114d: Status 404 returned error can't find the container with id b20b95eb36039045cc49af319621d539e3867b2ec5748b29d5419b0ee942114d Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.911103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" event={"ID":"5be0c14a-e51f-4b69-ab58-c0cac66910e2","Type":"ContainerStarted","Data":"4e182608610914b91c743dff33bf39a9d5d2c35ec6581b104adebb05d28d93c6"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.911348 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.913252 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" 
event={"ID":"afbeb2d8-c332-447b-a931-9fe7b246914d","Type":"ContainerStarted","Data":"d2b3b5996c5e1d357eb827e231563c1dfbc1e8a7644d2936251e261afdb389a1"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.913340 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.915719 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" event={"ID":"38eef260-c32f-4568-9936-6197ba984f05","Type":"ContainerStarted","Data":"2a103823842c06754087deab2ba925169b8a9452b423ea1bc09fc08779c4d9b9"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.915976 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.917232 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerStarted","Data":"b20b95eb36039045cc49af319621d539e3867b2ec5748b29d5419b0ee942114d"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.918679 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" event={"ID":"ce7c40ca-05ad-49ca-a091-02ac588c3eb7","Type":"ContainerStarted","Data":"a03d27af0446736421a168d8f1e337e63624375f40c20ac3cfd7ea03bfcaf4f2"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.918819 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.921565 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" event={"ID":"8d440653-f1c3-483c-a37d-463dcfc15224","Type":"ContainerStarted","Data":"690dd1cfb71b1b2f2bea9fd378e31a5b15028476b9244ea65aeacb8ab832456a"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.923176 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" event={"ID":"54959b79-361c-415a-986d-1af6d8eb6701","Type":"ContainerStarted","Data":"f143a83a1fd6b9178dc8c6a6191c2dda227a8eecec4c3fc19e40f514d4533fb5"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.923416 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.925593 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" event={"ID":"caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2","Type":"ContainerStarted","Data":"8755619bb8a703a6c7e6ffa9eb407f4828e683d0e99c7314f1cfb9d655a92858"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.925765 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.927148 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" event={"ID":"619dff06-7255-4aab-9ffe-9f2561bcc904","Type":"ContainerStarted","Data":"a2c6867160870890433508410210e626be78f0792238e0ebf285803cf300b8a2"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.927343 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.927844 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr" podStartSLOduration=2.850600538 podStartE2EDuration="31.927831633s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.364227089 +0000 UTC m=+1058.175053523" lastFinishedPulling="2026-02-26 11:29:01.441458184 +0000 UTC m=+1087.252284618" observedRunningTime="2026-02-26 11:29:01.925470567 +0000 UTC m=+1087.736297001" watchObservedRunningTime="2026-02-26 11:29:01.927831633 +0000 UTC m=+1087.738658057" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.932708 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" event={"ID":"33fc0a61-18c9-4e80-b898-92a5b1b71dac","Type":"ContainerStarted","Data":"05f89827978a53ff9302503378bea2956dd208c90c777008090f6b234581af52"} Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.933909 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.952478 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj" podStartSLOduration=3.847185543 podStartE2EDuration="32.95245965s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.344397279 +0000 UTC m=+1058.155223713" lastFinishedPulling="2026-02-26 11:29:01.449671386 +0000 UTC m=+1087.260497820" observedRunningTime="2026-02-26 11:29:01.94646773 +0000 UTC m=+1087.757294164" watchObservedRunningTime="2026-02-26 11:29:01.95245965 +0000 UTC m=+1087.763286084" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.990048 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc" podStartSLOduration=3.913113376 podStartE2EDuration="32.990029402s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.355988246 +0000 UTC m=+1058.166814680" lastFinishedPulling="2026-02-26 11:29:01.432904272 +0000 UTC m=+1087.243730706" observedRunningTime="2026-02-26 11:29:01.973887365 +0000 UTC m=+1087.784713799" watchObservedRunningTime="2026-02-26 11:29:01.990029402 +0000 UTC m=+1087.800855826" Feb 26 11:29:01 crc kubenswrapper[4699]: I0226 11:29:01.997035 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm" podStartSLOduration=3.913423105 podStartE2EDuration="32.997013659s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.364064995 +0000 UTC m=+1058.174891429" lastFinishedPulling="2026-02-26 11:29:01.447655549 +0000 UTC m=+1087.258481983" observedRunningTime="2026-02-26 11:29:01.995685342 +0000 UTC m=+1087.806511786" watchObservedRunningTime="2026-02-26 11:29:01.997013659 +0000 UTC m=+1087.807840103" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.041134 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" podStartSLOduration=22.338810256 podStartE2EDuration="32.041105636s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:51.730058906 +0000 UTC m=+1077.540885340" lastFinishedPulling="2026-02-26 11:29:01.432354286 +0000 UTC m=+1087.243180720" observedRunningTime="2026-02-26 11:29:02.037638678 +0000 UTC m=+1087.848465112" watchObservedRunningTime="2026-02-26 11:29:02.041105636 +0000 UTC m=+1087.851932070" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.080482 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6" podStartSLOduration=23.475557131 podStartE2EDuration="33.080465768s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:51.845445418 +0000 UTC m=+1077.656271852" lastFinishedPulling="2026-02-26 11:29:01.450354045 +0000 UTC m=+1087.261180489" observedRunningTime="2026-02-26 11:29:02.070480506 +0000 UTC m=+1087.881306940" watchObservedRunningTime="2026-02-26 11:29:02.080465768 +0000 UTC m=+1087.891292192" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.107280 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ghqf4" podStartSLOduration=3.255878647 podStartE2EDuration="32.107258006s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.580483963 +0000 UTC m=+1058.391310397" lastFinishedPulling="2026-02-26 11:29:01.431863332 +0000 UTC m=+1087.242689756" observedRunningTime="2026-02-26 11:29:02.102260964 +0000 UTC m=+1087.913087398" watchObservedRunningTime="2026-02-26 11:29:02.107258006 +0000 UTC m=+1087.918084440" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.128561 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd" podStartSLOduration=3.961107473 podStartE2EDuration="33.128538597s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.282335564 +0000 UTC m=+1058.093161998" lastFinishedPulling="2026-02-26 11:29:01.449766688 +0000 UTC m=+1087.260593122" observedRunningTime="2026-02-26 11:29:02.121782786 +0000 UTC m=+1087.932609230" watchObservedRunningTime="2026-02-26 11:29:02.128538597 +0000 UTC m=+1087.939365041" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.286670 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr" podStartSLOduration=3.212991624 podStartE2EDuration="32.286654117s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.357414387 +0000 UTC m=+1058.168240821" lastFinishedPulling="2026-02-26 11:29:01.43107688 +0000 UTC m=+1087.241903314" observedRunningTime="2026-02-26 11:29:02.150944001 +0000 UTC m=+1087.961770465" watchObservedRunningTime="2026-02-26 11:29:02.286654117 +0000 UTC m=+1088.097480551" Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.951001 4699 generic.go:334] "Generic (PLEG): container finished" podID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerID="d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8" exitCode=0 Feb 26 11:29:02 crc kubenswrapper[4699]: I0226 11:29:02.951080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerDied","Data":"d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8"} Feb 26 11:29:03 crc kubenswrapper[4699]: I0226 11:29:03.970092 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" event={"ID":"0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee","Type":"ContainerStarted","Data":"73ff91eebc7079c7bd4e4770f147253ab675f37cd319dd6fa63a66d37ecee78e"} Feb 26 11:29:03 crc kubenswrapper[4699]: I0226 11:29:03.971317 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" event={"ID":"a6e7ca85-e18b-4605-9180-316f65b82006","Type":"ContainerStarted","Data":"69e79410d562cd7879f89803e097fb5466c498aec15a2c4d05823e8ae9dea80d"} Feb 26 11:29:06 crc kubenswrapper[4699]: I0226 11:29:06.635886 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb" Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.017650 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.022242 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.037994 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2wj2n" podStartSLOduration=9.555790113 podStartE2EDuration="40.037974355s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.255727732 +0000 UTC m=+1058.066554166" lastFinishedPulling="2026-02-26 11:29:02.737911974 +0000 UTC m=+1088.548738408" observedRunningTime="2026-02-26 11:29:10.032518061 +0000 UTC m=+1095.843344505" watchObservedRunningTime="2026-02-26 11:29:10.037974355 +0000 UTC m=+1095.848800789" Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.075671 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs" podStartSLOduration=10.427117225 podStartE2EDuration="41.07564533s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.088091993 +0000 UTC m=+1057.898918427" lastFinishedPulling="2026-02-26 11:29:02.736620098 +0000 UTC m=+1088.547446532" observedRunningTime="2026-02-26 11:29:10.070232617 +0000 UTC m=+1095.881059051" watchObservedRunningTime="2026-02-26 11:29:10.07564533 +0000 UTC m=+1095.886471774" Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.304830 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qf9vd"
Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.599829 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-9gwwj"
Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.722547 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-95whc"
Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.749050 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs"
Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.752128 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mghs"
Feb 26 11:29:10 crc kubenswrapper[4699]: I0226 11:29:10.792702 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6gblm"
Feb 26 11:29:11 crc kubenswrapper[4699]: I0226 11:29:11.226085 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-bqvxr"
Feb 26 11:29:11 crc kubenswrapper[4699]: I0226 11:29:11.248596 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mwvnr"
Feb 26 11:29:15 crc kubenswrapper[4699]: I0226 11:29:15.923598 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-mtrs6"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.076680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" event={"ID":"7545763d-d2d2-4b6e-980d-737062f0a894","Type":"ContainerStarted","Data":"21cf3c13d4147b90bbbc3e5c8c9143dd20a29710cc9f7a42b949171e4c3e971c"}
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.077159 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.079126 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" event={"ID":"a2c419ab-2a99-4d37-b46c-b84024f24b2e","Type":"ContainerStarted","Data":"df6d762b81d467e3a35d1a84b9e8af797f9c64a53bf891618914fe1e9c831664"}
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.079325 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.081214 4699 generic.go:334] "Generic (PLEG): container finished" podID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerID="4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa" exitCode=0
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.081298 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerDied","Data":"4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa"}
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.083073 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" event={"ID":"a2b3bf3b-a815-4033-983b-eedc16b8609f","Type":"ContainerStarted","Data":"b8acb2396ea2c43e9c570cc00a83fec56bd1443bc07ec2e8c0c379585fc63883"}
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.083276 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.097733 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77" podStartSLOduration=3.186917505 podStartE2EDuration="48.097718241s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.340624432 +0000 UTC m=+1058.151450866" lastFinishedPulling="2026-02-26 11:29:17.251425128 +0000 UTC m=+1103.062251602" observedRunningTime="2026-02-26 11:29:18.094430238 +0000 UTC m=+1103.905256692" watchObservedRunningTime="2026-02-26 11:29:18.097718241 +0000 UTC m=+1103.908544665"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.111282 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7" podStartSLOduration=3.430559793 podStartE2EDuration="48.111261624s" podCreationTimestamp="2026-02-26 11:28:30 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.572390244 +0000 UTC m=+1058.383216678" lastFinishedPulling="2026-02-26 11:29:17.253092075 +0000 UTC m=+1103.063918509" observedRunningTime="2026-02-26 11:29:18.110675327 +0000 UTC m=+1103.921501781" watchObservedRunningTime="2026-02-26 11:29:18.111261624 +0000 UTC m=+1103.922088058"
Feb 26 11:29:18 crc kubenswrapper[4699]: I0226 11:29:18.127786 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc" podStartSLOduration=4.205067859 podStartE2EDuration="49.127766721s" podCreationTimestamp="2026-02-26 11:28:29 +0000 UTC" firstStartedPulling="2026-02-26 11:28:32.328445388 +0000 UTC m=+1058.139271822" lastFinishedPulling="2026-02-26 11:29:17.25114425 +0000 UTC m=+1103.061970684" observedRunningTime="2026-02-26 11:29:18.126930237 +0000 UTC m=+1103.937756691" watchObservedRunningTime="2026-02-26 11:29:18.127766721 +0000 UTC m=+1103.938593165"
Feb 26 11:29:19 crc kubenswrapper[4699]: I0226 11:29:19.092393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerStarted","Data":"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"}
Feb 26 11:29:19 crc kubenswrapper[4699]: I0226 11:29:19.109885 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqqfb" podStartSLOduration=7.486145707 podStartE2EDuration="23.109867745s" podCreationTimestamp="2026-02-26 11:28:56 +0000 UTC" firstStartedPulling="2026-02-26 11:29:02.95529579 +0000 UTC m=+1088.766122224" lastFinishedPulling="2026-02-26 11:29:18.579017828 +0000 UTC m=+1104.389844262" observedRunningTime="2026-02-26 11:29:19.107614271 +0000 UTC m=+1104.918440705" watchObservedRunningTime="2026-02-26 11:29:19.109867745 +0000 UTC m=+1104.920694189"
Feb 26 11:29:26 crc kubenswrapper[4699]: I0226 11:29:26.977349 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:26 crc kubenswrapper[4699]: I0226 11:29:26.978974 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:27 crc kubenswrapper[4699]: I0226 11:29:27.020571 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:27 crc kubenswrapper[4699]: I0226 11:29:27.223797 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:27 crc kubenswrapper[4699]: I0226 11:29:27.275689 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"]
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.195849 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqqfb" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="registry-server" containerID="cri-o://1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d" gracePeriod=2
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.665870 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.767683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities\") pod \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") "
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.767752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content\") pod \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") "
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.767845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7dzn\" (UniqueName: \"kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn\") pod \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\" (UID: \"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb\") "
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.768800 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities" (OuterVolumeSpecName: "utilities") pod "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" (UID: "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.777108 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn" (OuterVolumeSpecName: "kube-api-access-p7dzn") pod "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" (UID: "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb"). InnerVolumeSpecName "kube-api-access-p7dzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.790799 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" (UID: "dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.868793 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.868830 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:29:29 crc kubenswrapper[4699]: I0226 11:29:29.868842 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7dzn\" (UniqueName: \"kubernetes.io/projected/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb-kube-api-access-p7dzn\") on node \"crc\" DevicePath \"\""
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.207975 4699 generic.go:334] "Generic (PLEG): container finished" podID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerID="1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d" exitCode=0
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.208029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerDied","Data":"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"}
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.208085 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqqfb" event={"ID":"dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb","Type":"ContainerDied","Data":"b20b95eb36039045cc49af319621d539e3867b2ec5748b29d5419b0ee942114d"}
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.208042 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqqfb"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.208104 4699 scope.go:117] "RemoveContainer" containerID="1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.228414 4699 scope.go:117] "RemoveContainer" containerID="4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.248719 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"]
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.254426 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqqfb"]
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.270726 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" path="/var/lib/kubelet/pods/dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb/volumes"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.271677 4699 scope.go:117] "RemoveContainer" containerID="d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.286589 4699 scope.go:117] "RemoveContainer" containerID="1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"
Feb 26 11:29:30 crc kubenswrapper[4699]: E0226 11:29:30.286973 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d\": container with ID starting with 1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d not found: ID does not exist" containerID="1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.287016 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d"} err="failed to get container status \"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d\": rpc error: code = NotFound desc = could not find container \"1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d\": container with ID starting with 1f8302bbf90eb87cb30aa8a818dfa3d5a4eed62fee1821d469834548965e488d not found: ID does not exist"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.287044 4699 scope.go:117] "RemoveContainer" containerID="4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa"
Feb 26 11:29:30 crc kubenswrapper[4699]: E0226 11:29:30.287580 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa\": container with ID starting with 4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa not found: ID does not exist" containerID="4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.287612 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa"} err="failed to get container status \"4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa\": rpc error: code = NotFound desc = could not find container \"4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa\": container with ID starting with 4112987fd7819bd9acbd52ee0573be4741193978a977b5c5b5cb9d664c60cbaa not found: ID does not exist"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.287635 4699 scope.go:117] "RemoveContainer" containerID="d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8"
Feb 26 11:29:30 crc kubenswrapper[4699]: E0226 11:29:30.287835 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8\": container with ID starting with d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8 not found: ID does not exist" containerID="d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.287869 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8"} err="failed to get container status \"d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8\": rpc error: code = NotFound desc = could not find container \"d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8\": container with ID starting with d12c24885bdbc88a04654df5aa4a62aabfb99830c195020e90a791e630ba32a8 not found: ID does not exist"
Feb 26 11:29:30 crc kubenswrapper[4699]: I0226 11:29:30.573441 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d2pxc"
Feb 26 11:29:31 crc kubenswrapper[4699]: I0226 11:29:31.203440 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jxr77"
Feb 26 11:29:31 crc kubenswrapper[4699]: I0226 11:29:31.376251 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-fnnc7"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.915839 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:29:47 crc kubenswrapper[4699]: E0226 11:29:47.917067 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="extract-content"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.917174 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="extract-content"
Feb 26 11:29:47 crc kubenswrapper[4699]: E0226 11:29:47.917193 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="extract-utilities"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.917200 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="extract-utilities"
Feb 26 11:29:47 crc kubenswrapper[4699]: E0226 11:29:47.917207 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="registry-server"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.917215 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="registry-server"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.917376 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcab2afa-9fb1-4d74-9a95-c2fe6a00bbfb" containerName="registry-server"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.918410 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.923280 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.923412 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-69gc7"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.923744 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.929590 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:29:47 crc kubenswrapper[4699]: I0226 11:29:47.931108 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.000750 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"]
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.001957 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.006458 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.009560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"]
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.122103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.122236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.122283 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4949\" (UniqueName: \"kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.122339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5r5x\" (UniqueName: \"kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.122385 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.223538 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5r5x\" (UniqueName: \"kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.223601 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.223639 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.223673 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.223707 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4949\" (UniqueName: \"kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.224842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.224868 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.224888 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.243678 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4949\" (UniqueName: \"kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949\") pod \"dnsmasq-dns-78dd6ddcc-65vml\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.248975 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5r5x\" (UniqueName: \"kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x\") pod \"dnsmasq-dns-675f4bcbfc-vn7l5\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.325022 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.537648 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.741469 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"]
Feb 26 11:29:48 crc kubenswrapper[4699]: I0226 11:29:48.934881 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:29:49 crc kubenswrapper[4699]: I0226 11:29:49.346620 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml" event={"ID":"3eb471e3-5e11-44a3-b3cd-176785c79d76","Type":"ContainerStarted","Data":"718ea2b3fa019be493e1d3c84030139b5efcc52cadefdd0f6d61e3832e93141d"}
Feb 26 11:29:49 crc kubenswrapper[4699]: I0226 11:29:49.349437 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5" event={"ID":"44038b95-eefd-44cd-9781-0a2273605e75","Type":"ContainerStarted","Data":"02fc15d2715e55f8c8cd19bc42d6cb612f93305db1f6bde0aa9d00c273dd8d8c"}
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.506922 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.531540 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.533016 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.543894 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.667971 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.668368 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khq7k\" (UniqueName: \"kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.668443 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.770836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.770936 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khq7k\" (UniqueName: \"kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.771009 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.772543 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.772572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.807655 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.808210 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khq7k\" (UniqueName: \"kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k\") pod \"dnsmasq-dns-666b6646f7-vpxrq\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") " pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.833001 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.834607 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.856371 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"]
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.889473 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.973195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.973323 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:50 crc kubenswrapper[4699]: I0226 11:29:50.973368 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx227\" (UniqueName: \"kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.074144 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.074249 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.074276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx227\" (UniqueName: \"kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.075230 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.075256 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.096364 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx227\" (UniqueName: \"kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227\") pod \"dnsmasq-dns-57d769cc4f-mwnwv\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.166467 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.529629 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.639756 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"]
Feb 26 11:29:51 crc kubenswrapper[4699]: W0226 11:29:51.646715 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13838b5f_5f0e_44ba_8b63_97b4e20efbce.slice/crio-30da7116ef227de51d61b599067ff253e3fbcd27cb9bf2e3d4c83d06e5a7374a WatchSource:0}: Error finding container 30da7116ef227de51d61b599067ff253e3fbcd27cb9bf2e3d4c83d06e5a7374a: Status 404 returned error can't find the container with id 30da7116ef227de51d61b599067ff253e3fbcd27cb9bf2e3d4c83d06e5a7374a
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.699165 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.700758 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705135 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705209 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705267 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g9kcp"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705265 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705288 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.705343 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.706641 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.712437 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.890909 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0"
Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891418 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName:
\"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891537 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891633 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891714 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42595\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891809 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.891914 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") 
pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.892016 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.892229 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.892346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.892466 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.996660 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " 
pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.996779 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.996840 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.996927 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.997893 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.997972 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.997433 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998183 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998244 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42595\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998304 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998329 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998359 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data\") 
pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.998444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.999204 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.999845 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:51 crc kubenswrapper[4699]: I0226 11:29:51.999921 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.002688 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.003780 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.006447 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.013012 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.020947 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.021842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.022780 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42595\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.033317 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.034706 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.036931 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.037238 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.037454 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.037706 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mp8r4" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.037880 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.038027 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.040520 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.044328 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.200896 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.200960 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.200983 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201019 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201170 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201192 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201222 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201239 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201257 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.201385 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7xp\" (UniqueName: 
\"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.303761 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.304005 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.304029 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.304062 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.304098 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306566 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306651 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306676 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306706 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306729 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 
11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.306764 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7xp\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.311605 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.312343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.312991 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.314760 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.314921 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.315227 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.315462 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.315924 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.335503 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.335526 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7xp\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.335731 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.338450 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.350020 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.388318 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.390872 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" event={"ID":"9e16e518-0512-4df0-b8c7-1cd2f9c1e352","Type":"ContainerStarted","Data":"2aaa9042481814730657b40428f86e835d8db2be305a9194e255d49a0c3e4409"}
Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.401217 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" event={"ID":"13838b5f-5f0e-44ba-8b63-97b4e20efbce","Type":"ContainerStarted","Data":"30da7116ef227de51d61b599067ff253e3fbcd27cb9bf2e3d4c83d06e5a7374a"}
Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.846672 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 11:29:52 crc kubenswrapper[4699]: I0226 11:29:52.853238 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.383977 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.388672 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.392841 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.392927 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.392936 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.393158 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m8zx4"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.420301 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.423940 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.623940 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.623997 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.624018 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.624053 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxgtj\" (UniqueName: \"kubernetes.io/projected/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kube-api-access-cxgtj\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.627250 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.627357 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.631669 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kolla-config\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.631768 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.735464 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.735523 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.735721 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kolla-config\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.735761 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.736285 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.736319 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.736336 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.736368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxgtj\" (UniqueName: \"kubernetes.io/projected/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kube-api-access-cxgtj\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.736971 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.737244 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.737633 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.738858 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kolla-config\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.739290 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fdc6b6d-ac77-4179-9864-f220d622c0f4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.749568 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.753827 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fdc6b6d-ac77-4179-9864-f220d622c0f4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.758139 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxgtj\" (UniqueName: \"kubernetes.io/projected/6fdc6b6d-ac77-4179-9864-f220d622c0f4-kube-api-access-cxgtj\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.767356 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"6fdc6b6d-ac77-4179-9864-f220d622c0f4\") " pod="openstack/openstack-galera-0"
Feb 26 11:29:53 crc kubenswrapper[4699]: I0226 11:29:53.982013 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.780460 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.781674 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.786730 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.786936 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.787096 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.787893 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hwww4"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.805184 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.850360 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.851547 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.857566 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.858165 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-znbkh"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.858413 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.868084 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959277 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5flh5\" (UniqueName: \"kubernetes.io/projected/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kube-api-access-5flh5\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959460 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959497 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959600 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959743 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959802 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kolla-config\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959823 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959919 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-config-data\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.959990 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.960033 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.960060 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.960081 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ntn\" (UniqueName: \"kubernetes.io/projected/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kube-api-access-f4ntn\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:54 crc kubenswrapper[4699]: I0226 11:29:54.960130 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061343 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061361 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ntn\" (UniqueName: \"kubernetes.io/projected/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kube-api-access-f4ntn\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061393 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061419 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5flh5\" (UniqueName: \"kubernetes.io/projected/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kube-api-access-5flh5\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061507 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061535 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061553 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061568 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kolla-config\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061597 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-config-data\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.061615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.062534 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.062757 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.062861 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.063423 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-config-data\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.063850 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.064136 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kolla-config\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.064694 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.065964 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.067067 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.067358 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.077213 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.081766 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ntn\" (UniqueName: \"kubernetes.io/projected/edce8e75-6dd5-4fbd-8f76-bc6553cc27b9-kube-api-access-f4ntn\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.083526 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5flh5\" (UniqueName: \"kubernetes.io/projected/6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2-kube-api-access-5flh5\") pod \"memcached-0\" (UID: \"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2\") " pod="openstack/memcached-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.084231 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9\") " pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.122793 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 26 11:29:55 crc kubenswrapper[4699]: I0226 11:29:55.181803 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.385750 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.387077 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.388843 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-f8vrt"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.396660 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.506694 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4vnz\" (UniqueName: \"kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz\") pod \"kube-state-metrics-0\" (UID: \"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf\") " pod="openstack/kube-state-metrics-0"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.608372 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4vnz\" (UniqueName: \"kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz\") pod \"kube-state-metrics-0\" (UID: \"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf\") " pod="openstack/kube-state-metrics-0"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.631678 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4vnz\" (UniqueName: \"kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz\") pod \"kube-state-metrics-0\" (UID: \"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf\") " pod="openstack/kube-state-metrics-0"
Feb 26 11:29:57 crc kubenswrapper[4699]: I0226 11:29:57.702337 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 11:29:58 crc kubenswrapper[4699]: W0226 11:29:58.581315 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d57084d_dc87_44e4_bbc8_50c402b7165b.slice/crio-6c8df8aa27d02e0ceb8002bd8f20b8b521706c7be8fe88b152c705914906b7ae WatchSource:0}: Error finding container 6c8df8aa27d02e0ceb8002bd8f20b8b521706c7be8fe88b152c705914906b7ae: Status 404 returned error can't find the container with id 6c8df8aa27d02e0ceb8002bd8f20b8b521706c7be8fe88b152c705914906b7ae
Feb 26 11:29:59 crc kubenswrapper[4699]: I0226 11:29:59.480029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerStarted","Data":"6c8df8aa27d02e0ceb8002bd8f20b8b521706c7be8fe88b152c705914906b7ae"}
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.139501 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535090-7v44h"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.140824 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535090-7v44h"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.148005 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.148219 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.148341 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.159697 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.160888 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.163988 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.164162 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.192662 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535090-7v44h"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.203818 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.209468 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nrvng"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.210668 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.219480 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.222094 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hplxc"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.222271 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.222518 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.256492 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gxnxl"]
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.258770 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gxnxl"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.259022 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvx8x\" (UniqueName: \"kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.259082 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.259302 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmg6x\" (UniqueName: \"kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x\") pod \"auto-csr-approver-29535090-7v44h\" (UID: \"a0d38a99-b56f-423c-9c5b-c8f726bf62f9\") " pod="openshift-infra/auto-csr-approver-29535090-7v44h"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.259346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.292158 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gxnxl"]
Feb 26
11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362235 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-scripts\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-run\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362385 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q5c2\" (UniqueName: \"kubernetes.io/projected/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-kube-api-access-5q5c2\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-lib\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362455 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrz2\" (UniqueName: \"kubernetes.io/projected/cd4015f0-f1a7-40d7-ae69-089f74a6873d-kube-api-access-xqrz2\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc 
kubenswrapper[4699]: I0226 11:30:00.362491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362511 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-combined-ca-bundle\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmg6x\" (UniqueName: \"kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x\") pod \"auto-csr-approver-29535090-7v44h\" (UID: \"a0d38a99-b56f-423c-9c5b-c8f726bf62f9\") " pod="openshift-infra/auto-csr-approver-29535090-7v44h" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362584 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362643 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-log\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc 
kubenswrapper[4699]: I0226 11:30:00.362673 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-log-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362694 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvx8x\" (UniqueName: \"kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362717 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362748 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-ovn-controller-tls-certs\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362775 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-etc-ovs\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " 
pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362795 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd4015f0-f1a7-40d7-ae69-089f74a6873d-scripts\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.362828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.363912 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.379209 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.381920 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmg6x\" (UniqueName: \"kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x\") pod \"auto-csr-approver-29535090-7v44h\" (UID: \"a0d38a99-b56f-423c-9c5b-c8f726bf62f9\") " 
pod="openshift-infra/auto-csr-approver-29535090-7v44h" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.383317 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvx8x\" (UniqueName: \"kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x\") pod \"collect-profiles-29535090-n42nj\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.464941 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-ovn-controller-tls-certs\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465005 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-etc-ovs\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465036 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd4015f0-f1a7-40d7-ae69-089f74a6873d-scripts\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465074 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc 
kubenswrapper[4699]: I0226 11:30:00.465143 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-scripts\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465195 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-run\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465220 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q5c2\" (UniqueName: \"kubernetes.io/projected/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-kube-api-access-5q5c2\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-lib\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465314 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrz2\" (UniqueName: \"kubernetes.io/projected/cd4015f0-f1a7-40d7-ae69-089f74a6873d-kube-api-access-xqrz2\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465340 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-combined-ca-bundle\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465362 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465412 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-log\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465430 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-log-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.465948 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-log-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.466349 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-etc-ovs\") pod \"ovn-controller-ovs-gxnxl\" (UID: 
\"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.467459 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.467625 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-log\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.467844 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd4015f0-f1a7-40d7-ae69-089f74a6873d-var-run-ovn\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.467934 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-run\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.468507 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-var-lib\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.469013 4699 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-scripts\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.473141 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-ovn-controller-tls-certs\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.473401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd4015f0-f1a7-40d7-ae69-089f74a6873d-scripts\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.474963 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4015f0-f1a7-40d7-ae69-089f74a6873d-combined-ca-bundle\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.481995 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrz2\" (UniqueName: \"kubernetes.io/projected/cd4015f0-f1a7-40d7-ae69-089f74a6873d-kube-api-access-xqrz2\") pod \"ovn-controller-nrvng\" (UID: \"cd4015f0-f1a7-40d7-ae69-089f74a6873d\") " pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.482921 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535090-7v44h" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.483270 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q5c2\" (UniqueName: \"kubernetes.io/projected/8afc038e-11dc-4959-a6b0-61e9b1c2dc35-kube-api-access-5q5c2\") pod \"ovn-controller-ovs-gxnxl\" (UID: \"8afc038e-11dc-4959-a6b0-61e9b1c2dc35\") " pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.507772 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.553257 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng" Feb 26 11:30:00 crc kubenswrapper[4699]: I0226 11:30:00.587786 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:03 crc kubenswrapper[4699]: I0226 11:30:03.511254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerStarted","Data":"c653f2114aeba63b01bf441458d5ec8f8a6f7c0f66f8ee44c878928901c377ac"} Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.453423 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.457214 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.461496 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.461715 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-phqm4" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.461981 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.462173 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.462337 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.465483 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.542705 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.542743 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.542773 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4jf5w\" (UniqueName: \"kubernetes.io/projected/ef805480-81ec-4d0b-b2ca-06db4bf74383-kube-api-access-4jf5w\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.542846 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.542964 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-config\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.543004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.543024 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.543040 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646339 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646384 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646409 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jf5w\" (UniqueName: \"kubernetes.io/projected/ef805480-81ec-4d0b-b2ca-06db4bf74383-kube-api-access-4jf5w\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646426 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646454 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-config\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 
crc kubenswrapper[4699]: I0226 11:30:04.646496 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646522 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.646544 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.647084 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.648311 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.648324 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ef805480-81ec-4d0b-b2ca-06db4bf74383-config\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.648917 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.652568 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.655074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.655091 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.655866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef805480-81ec-4d0b-b2ca-06db4bf74383-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.672811 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.676240 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6kllt" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.676487 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.676507 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jf5w\" (UniqueName: \"kubernetes.io/projected/ef805480-81ec-4d0b-b2ca-06db4bf74383-kube-api-access-4jf5w\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.677905 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.680993 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.685012 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.688412 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ef805480-81ec-4d0b-b2ca-06db4bf74383\") " pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.784610 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.848821 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.848880 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.848939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.849023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.849058 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc 
kubenswrapper[4699]: I0226 11:30:04.849097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.849146 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz9jn\" (UniqueName: \"kubernetes.io/projected/b981c8a5-ce76-4bc1-a018-28255391e3f2-kube-api-access-sz9jn\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.849175 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951498 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951665 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951718 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951770 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951796 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz9jn\" (UniqueName: \"kubernetes.io/projected/b981c8a5-ce76-4bc1-a018-28255391e3f2-kube-api-access-sz9jn\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951887 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.951915 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.952141 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.953171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.953648 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b981c8a5-ce76-4bc1-a018-28255391e3f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.954157 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.957912 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.968729 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.968985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b981c8a5-ce76-4bc1-a018-28255391e3f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.977563 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz9jn\" (UniqueName: \"kubernetes.io/projected/b981c8a5-ce76-4bc1-a018-28255391e3f2-kube-api-access-sz9jn\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:04 crc kubenswrapper[4699]: I0226 11:30:04.979673 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b981c8a5-ce76-4bc1-a018-28255391e3f2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:05 crc kubenswrapper[4699]: I0226 11:30:05.047300 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:07 crc kubenswrapper[4699]: E0226 11:30:07.740279 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 11:30:07 crc kubenswrapper[4699]: E0226 11:30:07.740787 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-65vml_openstack(3eb471e3-5e11-44a3-b3cd-176785c79d76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:30:07 crc kubenswrapper[4699]: E0226 11:30:07.742088 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml" podUID="3eb471e3-5e11-44a3-b3cd-176785c79d76" Feb 26 11:30:08 crc kubenswrapper[4699]: I0226 11:30:08.213537 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 11:30:08 crc kubenswrapper[4699]: I0226 11:30:08.318844 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 11:30:09 crc kubenswrapper[4699]: W0226 11:30:09.179713 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fdc6b6d_ac77_4179_9864_f220d622c0f4.slice/crio-180daa314286a6e53251b4285a9fb298155a547e2d8c9a8fe119ff5f5519e021 WatchSource:0}: Error finding container 180daa314286a6e53251b4285a9fb298155a547e2d8c9a8fe119ff5f5519e021: Status 404 returned error can't find the container with id 180daa314286a6e53251b4285a9fb298155a547e2d8c9a8fe119ff5f5519e021 Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 
11:30:09.262069 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.428768 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config\") pod \"3eb471e3-5e11-44a3-b3cd-176785c79d76\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.429280 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4949\" (UniqueName: \"kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949\") pod \"3eb471e3-5e11-44a3-b3cd-176785c79d76\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.429390 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc\") pod \"3eb471e3-5e11-44a3-b3cd-176785c79d76\" (UID: \"3eb471e3-5e11-44a3-b3cd-176785c79d76\") " Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.429274 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config" (OuterVolumeSpecName: "config") pod "3eb471e3-5e11-44a3-b3cd-176785c79d76" (UID: "3eb471e3-5e11-44a3-b3cd-176785c79d76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.429877 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3eb471e3-5e11-44a3-b3cd-176785c79d76" (UID: "3eb471e3-5e11-44a3-b3cd-176785c79d76"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.430072 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.430096 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb471e3-5e11-44a3-b3cd-176785c79d76-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.433900 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949" (OuterVolumeSpecName: "kube-api-access-h4949") pod "3eb471e3-5e11-44a3-b3cd-176785c79d76" (UID: "3eb471e3-5e11-44a3-b3cd-176785c79d76"). InnerVolumeSpecName "kube-api-access-h4949". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.495383 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.500959 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.531723 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4949\" (UniqueName: \"kubernetes.io/projected/3eb471e3-5e11-44a3-b3cd-176785c79d76-kube-api-access-h4949\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:09 crc kubenswrapper[4699]: W0226 11:30:09.547298 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b298a96_eca9_49eb_a547_f88e986f326e.slice/crio-73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa WatchSource:0}: Error finding container 
73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa: Status 404 returned error can't find the container with id 73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa Feb 26 11:30:09 crc kubenswrapper[4699]: W0226 11:30:09.549022 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6530fcf8_efdc_4f91_96cb_4f4bdc8bd1d2.slice/crio-015d7330286830e04c4c9f823e3781a8f3e132ae1217540e363d506a4ef6dc91 WatchSource:0}: Error finding container 015d7330286830e04c4c9f823e3781a8f3e132ae1217540e363d506a4ef6dc91: Status 404 returned error can't find the container with id 015d7330286830e04c4c9f823e3781a8f3e132ae1217540e363d506a4ef6dc91 Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.553662 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml" event={"ID":"3eb471e3-5e11-44a3-b3cd-176785c79d76","Type":"ContainerDied","Data":"718ea2b3fa019be493e1d3c84030139b5efcc52cadefdd0f6d61e3832e93141d"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.553758 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65vml" Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.558094 4699 generic.go:334] "Generic (PLEG): container finished" podID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerID="4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958" exitCode=0 Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.558187 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" event={"ID":"9e16e518-0512-4df0-b8c7-1cd2f9c1e352","Type":"ContainerDied","Data":"4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.560742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf","Type":"ContainerStarted","Data":"d71534977c30792b789d4e1ac180ec5af3f9ed3738ad0ab651747396010424ea"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.564241 4699 generic.go:334] "Generic (PLEG): container finished" podID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerID="d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe" exitCode=0 Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.564290 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" event={"ID":"13838b5f-5f0e-44ba-8b63-97b4e20efbce","Type":"ContainerDied","Data":"d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.582106 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5" event={"ID":"44038b95-eefd-44cd-9781-0a2273605e75","Type":"ContainerStarted","Data":"80050d8650124cdda213563d70066e26f43de8d356825ac23d9b4fdfcc1d3b22"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.620390 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"6fdc6b6d-ac77-4179-9864-f220d622c0f4","Type":"ContainerStarted","Data":"180daa314286a6e53251b4285a9fb298155a547e2d8c9a8fe119ff5f5519e021"} Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.698405 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.702532 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65vml"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.725608 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.742752 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng"] Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.833330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 11:30:09 crc kubenswrapper[4699]: W0226 11:30:09.869091 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd4015f0_f1a7_40d7_ae69_089f74a6873d.slice/crio-e0ddd082de6aa81c77716d005cbb1ffb13e1a074048298bbd3dbcb794e695dc8 WatchSource:0}: Error finding container e0ddd082de6aa81c77716d005cbb1ffb13e1a074048298bbd3dbcb794e695dc8: Status 404 returned error can't find the container with id e0ddd082de6aa81c77716d005cbb1ffb13e1a074048298bbd3dbcb794e695dc8 Feb 26 11:30:09 crc kubenswrapper[4699]: I0226 11:30:09.937680 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535090-7v44h"] Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.033658 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.134741 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gxnxl"] Feb 26 11:30:10 crc 
kubenswrapper[4699]: I0226 11:30:10.287828 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb471e3-5e11-44a3-b3cd-176785c79d76" path="/var/lib/kubelet/pods/3eb471e3-5e11-44a3-b3cd-176785c79d76/volumes" Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.578467 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.631456 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" event={"ID":"9e16e518-0512-4df0-b8c7-1cd2f9c1e352","Type":"ContainerStarted","Data":"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.631526 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.633279 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerStarted","Data":"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.635399 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9","Type":"ContainerStarted","Data":"35748eb519ea6086c036c295d69a5cb3c52e1a34fc5cccd2cdf67e3b46b840e7"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.636497 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng" event={"ID":"cd4015f0-f1a7-40d7-ae69-089f74a6873d","Type":"ContainerStarted","Data":"e0ddd082de6aa81c77716d005cbb1ffb13e1a074048298bbd3dbcb794e695dc8"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.637737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"b981c8a5-ce76-4bc1-a018-28255391e3f2","Type":"ContainerStarted","Data":"125649a15ff4d41dbc758db636b019e0dbee2d3932c70b8de8de8d9909f37601"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.638979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2","Type":"ContainerStarted","Data":"015d7330286830e04c4c9f823e3781a8f3e132ae1217540e363d506a4ef6dc91"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.639834 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gxnxl" event={"ID":"8afc038e-11dc-4959-a6b0-61e9b1c2dc35","Type":"ContainerStarted","Data":"7a1471dd7a177467bafda49b607bab61b7b07a37e78e1062c61ad6831146cbf5"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.640997 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerStarted","Data":"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.642786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ef805480-81ec-4d0b-b2ca-06db4bf74383","Type":"ContainerStarted","Data":"12c7d2a9a29d81a322a2f794b5e7d85e9cdca114161fa1145fb259ecb38d8916"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.647531 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535090-7v44h" event={"ID":"a0d38a99-b56f-423c-9c5b-c8f726bf62f9","Type":"ContainerStarted","Data":"f0b01a3b5e5254bcfd2326338d3cde12bc7d1f83ca8ef3d4f65d618963dce401"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.649235 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" podStartSLOduration=2.952203198 podStartE2EDuration="20.649224467s" podCreationTimestamp="2026-02-26 11:29:50 +0000 UTC" 
firstStartedPulling="2026-02-26 11:29:51.53095071 +0000 UTC m=+1137.341777134" lastFinishedPulling="2026-02-26 11:30:09.227971929 +0000 UTC m=+1155.038798403" observedRunningTime="2026-02-26 11:30:10.646765638 +0000 UTC m=+1156.457592082" watchObservedRunningTime="2026-02-26 11:30:10.649224467 +0000 UTC m=+1156.460050901" Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.650278 4699 generic.go:334] "Generic (PLEG): container finished" podID="44038b95-eefd-44cd-9781-0a2273605e75" containerID="80050d8650124cdda213563d70066e26f43de8d356825ac23d9b4fdfcc1d3b22" exitCode=0 Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.650314 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5" event={"ID":"44038b95-eefd-44cd-9781-0a2273605e75","Type":"ContainerDied","Data":"80050d8650124cdda213563d70066e26f43de8d356825ac23d9b4fdfcc1d3b22"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.650377 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5" event={"ID":"44038b95-eefd-44cd-9781-0a2273605e75","Type":"ContainerDied","Data":"02fc15d2715e55f8c8cd19bc42d6cb612f93305db1f6bde0aa9d00c273dd8d8c"} Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.650390 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02fc15d2715e55f8c8cd19bc42d6cb612f93305db1f6bde0aa9d00c273dd8d8c" Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.653312 4699 generic.go:334] "Generic (PLEG): container finished" podID="9b298a96-eca9-49eb-a547-f88e986f326e" containerID="81dc18175a458a0d1e57583f805b2614af5b4f06183622336860874df0cedc4e" exitCode=0 Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.653362 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" 
event={"ID":"9b298a96-eca9-49eb-a547-f88e986f326e","Type":"ContainerDied","Data":"81dc18175a458a0d1e57583f805b2614af5b4f06183622336860874df0cedc4e"}
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.653390 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" event={"ID":"9b298a96-eca9-49eb-a547-f88e986f326e","Type":"ContainerStarted","Data":"73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa"}
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.675684 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.762524 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5r5x\" (UniqueName: \"kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x\") pod \"44038b95-eefd-44cd-9781-0a2273605e75\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") "
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.762670 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config\") pod \"44038b95-eefd-44cd-9781-0a2273605e75\" (UID: \"44038b95-eefd-44cd-9781-0a2273605e75\") "
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.768230 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x" (OuterVolumeSpecName: "kube-api-access-d5r5x") pod "44038b95-eefd-44cd-9781-0a2273605e75" (UID: "44038b95-eefd-44cd-9781-0a2273605e75"). InnerVolumeSpecName "kube-api-access-d5r5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.780722 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config" (OuterVolumeSpecName: "config") pod "44038b95-eefd-44cd-9781-0a2273605e75" (UID: "44038b95-eefd-44cd-9781-0a2273605e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.865057 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5r5x\" (UniqueName: \"kubernetes.io/projected/44038b95-eefd-44cd-9781-0a2273605e75-kube-api-access-d5r5x\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:10 crc kubenswrapper[4699]: I0226 11:30:10.865094 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44038b95-eefd-44cd-9781-0a2273605e75-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.584880 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.584944 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.664815 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" event={"ID":"13838b5f-5f0e-44ba-8b63-97b4e20efbce","Type":"ContainerStarted","Data":"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800"}
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.665057 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vn7l5"
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.682356 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" podStartSLOduration=4.07180611 podStartE2EDuration="21.682336693s" podCreationTimestamp="2026-02-26 11:29:50 +0000 UTC" firstStartedPulling="2026-02-26 11:29:51.6512119 +0000 UTC m=+1137.462038334" lastFinishedPulling="2026-02-26 11:30:09.261742483 +0000 UTC m=+1155.072568917" observedRunningTime="2026-02-26 11:30:11.680421639 +0000 UTC m=+1157.491248083" watchObservedRunningTime="2026-02-26 11:30:11.682336693 +0000 UTC m=+1157.493163137"
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.737036 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:30:11 crc kubenswrapper[4699]: I0226 11:30:11.747441 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vn7l5"]
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.272396 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44038b95-eefd-44cd-9781-0a2273605e75" path="/var/lib/kubelet/pods/44038b95-eefd-44cd-9781-0a2273605e75/volumes"
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.493635 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.594992 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume\") pod \"9b298a96-eca9-49eb-a547-f88e986f326e\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") "
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.595044 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume\") pod \"9b298a96-eca9-49eb-a547-f88e986f326e\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") "
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.595151 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvx8x\" (UniqueName: \"kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x\") pod \"9b298a96-eca9-49eb-a547-f88e986f326e\" (UID: \"9b298a96-eca9-49eb-a547-f88e986f326e\") "
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.596766 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b298a96-eca9-49eb-a547-f88e986f326e" (UID: "9b298a96-eca9-49eb-a547-f88e986f326e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.600780 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b298a96-eca9-49eb-a547-f88e986f326e" (UID: "9b298a96-eca9-49eb-a547-f88e986f326e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.609982 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x" (OuterVolumeSpecName: "kube-api-access-vvx8x") pod "9b298a96-eca9-49eb-a547-f88e986f326e" (UID: "9b298a96-eca9-49eb-a547-f88e986f326e"). InnerVolumeSpecName "kube-api-access-vvx8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.671001 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj" event={"ID":"9b298a96-eca9-49eb-a547-f88e986f326e","Type":"ContainerDied","Data":"73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa"}
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.671062 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73fa3e6eb5ecf757d1fa8efa2ff8a94d5bbdeb6c98a459819b28cec05f8462fa"
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.671025 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.671525 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.698348 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b298a96-eca9-49eb-a547-f88e986f326e-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.698381 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b298a96-eca9-49eb-a547-f88e986f326e-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:12 crc kubenswrapper[4699]: I0226 11:30:12.698393 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvx8x\" (UniqueName: \"kubernetes.io/projected/9b298a96-eca9-49eb-a547-f88e986f326e-kube-api-access-vvx8x\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:15 crc kubenswrapper[4699]: I0226 11:30:15.891636 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:30:16 crc kubenswrapper[4699]: I0226 11:30:16.168352 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv"
Feb 26 11:30:16 crc kubenswrapper[4699]: I0226 11:30:16.224250 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:30:16 crc kubenswrapper[4699]: I0226 11:30:16.707727 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="dnsmasq-dns" containerID="cri-o://fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6" gracePeriod=10
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.183468 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.284166 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config\") pod \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") "
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.284719 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khq7k\" (UniqueName: \"kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k\") pod \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") "
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.284752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc\") pod \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\" (UID: \"9e16e518-0512-4df0-b8c7-1cd2f9c1e352\") "
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.291272 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k" (OuterVolumeSpecName: "kube-api-access-khq7k") pod "9e16e518-0512-4df0-b8c7-1cd2f9c1e352" (UID: "9e16e518-0512-4df0-b8c7-1cd2f9c1e352"). InnerVolumeSpecName "kube-api-access-khq7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.348961 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e16e518-0512-4df0-b8c7-1cd2f9c1e352" (UID: "9e16e518-0512-4df0-b8c7-1cd2f9c1e352"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.359677 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config" (OuterVolumeSpecName: "config") pod "9e16e518-0512-4df0-b8c7-1cd2f9c1e352" (UID: "9e16e518-0512-4df0-b8c7-1cd2f9c1e352"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.386461 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.386483 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khq7k\" (UniqueName: \"kubernetes.io/projected/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-kube-api-access-khq7k\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.386495 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e16e518-0512-4df0-b8c7-1cd2f9c1e352-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.716517 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ef805480-81ec-4d0b-b2ca-06db4bf74383","Type":"ContainerStarted","Data":"83de79cf35fb56cc25b9ade694c104cc19ad051e503fbae8961ea45a36867761"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.722608 4699 generic.go:334] "Generic (PLEG): container finished" podID="a0d38a99-b56f-423c-9c5b-c8f726bf62f9" containerID="02c1126ec0d166bfd6091e444f16da2788ee1d75f58864b8bc99a6f2547f9104" exitCode=0
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.722742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535090-7v44h" event={"ID":"a0d38a99-b56f-423c-9c5b-c8f726bf62f9","Type":"ContainerDied","Data":"02c1126ec0d166bfd6091e444f16da2788ee1d75f58864b8bc99a6f2547f9104"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.724942 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng" event={"ID":"cd4015f0-f1a7-40d7-ae69-089f74a6873d","Type":"ContainerStarted","Data":"6daa0e89b9d465ed2a671b70b807249f8398d2cac3d7fa7e605d52f5a2b8b1c9"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.726105 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nrvng"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.729737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b981c8a5-ce76-4bc1-a018-28255391e3f2","Type":"ContainerStarted","Data":"f5b08dafae9646ed5b59889aec03efa40dde95cfd3131c9d3c0a40ce48338bd5"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.732599 4699 generic.go:334] "Generic (PLEG): container finished" podID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerID="fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6" exitCode=0
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.732669 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" event={"ID":"9e16e518-0512-4df0-b8c7-1cd2f9c1e352","Type":"ContainerDied","Data":"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.732693 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq" event={"ID":"9e16e518-0512-4df0-b8c7-1cd2f9c1e352","Type":"ContainerDied","Data":"2aaa9042481814730657b40428f86e835d8db2be305a9194e255d49a0c3e4409"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.732710 4699 scope.go:117] "RemoveContainer" containerID="fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.732851 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vpxrq"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.735977 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf","Type":"ContainerStarted","Data":"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.736422 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.742298 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2","Type":"ContainerStarted","Data":"d4e69267c485636aa7f9c0d96e2f0273d578817d607b0b5383e8f27e20ec9d5b"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.743539 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.748428 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gxnxl" event={"ID":"8afc038e-11dc-4959-a6b0-61e9b1c2dc35","Type":"ContainerStarted","Data":"30fd8efce92530234f9e98449df002b51929f5b86050e02a5e9ce686fe6ee5d5"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.753679 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6fdc6b6d-ac77-4179-9864-f220d622c0f4","Type":"ContainerStarted","Data":"bc8cfe4cbc14669a7d23e320ce251a249775b1b75e5ecb548ae10249e750c023"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.757960 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9","Type":"ContainerStarted","Data":"bf6fd25fd5c5219234878667bbfc768fc4a7fc9b607b1bbc5dba75d0bb38306a"}
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.760479 4699 scope.go:117] "RemoveContainer" containerID="4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.765033 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.018325433 podStartE2EDuration="20.765007959s" podCreationTimestamp="2026-02-26 11:29:57 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.210470084 +0000 UTC m=+1155.021296518" lastFinishedPulling="2026-02-26 11:30:16.95715261 +0000 UTC m=+1162.767979044" observedRunningTime="2026-02-26 11:30:17.758366267 +0000 UTC m=+1163.569192721" watchObservedRunningTime="2026-02-26 11:30:17.765007959 +0000 UTC m=+1163.575834403"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.785215 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nrvng" podStartSLOduration=10.767181605 podStartE2EDuration="17.785190322s" podCreationTimestamp="2026-02-26 11:30:00 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.871089309 +0000 UTC m=+1155.681915733" lastFinishedPulling="2026-02-26 11:30:16.889098016 +0000 UTC m=+1162.699924450" observedRunningTime="2026-02-26 11:30:17.776640575 +0000 UTC m=+1163.587467019" watchObservedRunningTime="2026-02-26 11:30:17.785190322 +0000 UTC m=+1163.596016756"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.794346 4699 scope.go:117] "RemoveContainer" containerID="fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"
Feb 26 11:30:17 crc kubenswrapper[4699]: E0226 11:30:17.794795 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6\": container with ID starting with fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6 not found: ID does not exist" containerID="fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.794833 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6"} err="failed to get container status \"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6\": rpc error: code = NotFound desc = could not find container \"fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6\": container with ID starting with fbd96fb959f10fe807f04f188f2e74e23275f80b68cbba2481fd65c49eae67f6 not found: ID does not exist"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.794855 4699 scope.go:117] "RemoveContainer" containerID="4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958"
Feb 26 11:30:17 crc kubenswrapper[4699]: E0226 11:30:17.798301 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958\": container with ID starting with 4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958 not found: ID does not exist" containerID="4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.798346 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958"} err="failed to get container status \"4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958\": rpc error: code = NotFound desc = could not find container \"4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958\": container with ID starting with 4e432431b1637ae6370a4d90a6195b4e6436618155eeb3ae004d2fe9bb9b8958 not found: ID does not exist"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.835647 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.984357537 podStartE2EDuration="23.835628549s" podCreationTimestamp="2026-02-26 11:29:54 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.552096582 +0000 UTC m=+1155.362923026" lastFinishedPulling="2026-02-26 11:30:16.403367604 +0000 UTC m=+1162.214194038" observedRunningTime="2026-02-26 11:30:17.831580302 +0000 UTC m=+1163.642406746" watchObservedRunningTime="2026-02-26 11:30:17.835628549 +0000 UTC m=+1163.646454983"
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.886619 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:30:17 crc kubenswrapper[4699]: I0226 11:30:17.894752 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vpxrq"]
Feb 26 11:30:18 crc kubenswrapper[4699]: I0226 11:30:18.269865 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" path="/var/lib/kubelet/pods/9e16e518-0512-4df0-b8c7-1cd2f9c1e352/volumes"
Feb 26 11:30:18 crc kubenswrapper[4699]: I0226 11:30:18.775655 4699 generic.go:334] "Generic (PLEG): container finished" podID="8afc038e-11dc-4959-a6b0-61e9b1c2dc35" containerID="30fd8efce92530234f9e98449df002b51929f5b86050e02a5e9ce686fe6ee5d5" exitCode=0
Feb 26 11:30:18 crc kubenswrapper[4699]: I0226 11:30:18.775722 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gxnxl" event={"ID":"8afc038e-11dc-4959-a6b0-61e9b1c2dc35","Type":"ContainerDied","Data":"30fd8efce92530234f9e98449df002b51929f5b86050e02a5e9ce686fe6ee5d5"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.151445 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535090-7v44h"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.318472 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmg6x\" (UniqueName: \"kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x\") pod \"a0d38a99-b56f-423c-9c5b-c8f726bf62f9\" (UID: \"a0d38a99-b56f-423c-9c5b-c8f726bf62f9\") "
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.322103 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x" (OuterVolumeSpecName: "kube-api-access-jmg6x") pod "a0d38a99-b56f-423c-9c5b-c8f726bf62f9" (UID: "a0d38a99-b56f-423c-9c5b-c8f726bf62f9"). InnerVolumeSpecName "kube-api-access-jmg6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.420358 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmg6x\" (UniqueName: \"kubernetes.io/projected/a0d38a99-b56f-423c-9c5b-c8f726bf62f9-kube-api-access-jmg6x\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.786105 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ef805480-81ec-4d0b-b2ca-06db4bf74383","Type":"ContainerStarted","Data":"9946260ac01a55e7f1ab3f7896c759d9a52ef25112fb9dc7797036b2c1f1bc10"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.788662 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535090-7v44h"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.788661 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535090-7v44h" event={"ID":"a0d38a99-b56f-423c-9c5b-c8f726bf62f9","Type":"ContainerDied","Data":"f0b01a3b5e5254bcfd2326338d3cde12bc7d1f83ca8ef3d4f65d618963dce401"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.788816 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b01a3b5e5254bcfd2326338d3cde12bc7d1f83ca8ef3d4f65d618963dce401"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.793416 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b981c8a5-ce76-4bc1-a018-28255391e3f2","Type":"ContainerStarted","Data":"d91c2a24145839adbe8cd440ccb150622ce74f6cb1b8640ea93e964ccb3524cc"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.797930 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gxnxl" event={"ID":"8afc038e-11dc-4959-a6b0-61e9b1c2dc35","Type":"ContainerStarted","Data":"cb74f1ae25097fb7fa82c7a7adf3297259141728c67285fb021aa734c2697509"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.797981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gxnxl" event={"ID":"8afc038e-11dc-4959-a6b0-61e9b1c2dc35","Type":"ContainerStarted","Data":"3eebae866234458e23d0849a009bd08fb6aa50dedf47a3fb2fc906687ff99310"}
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.798000 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gxnxl"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.798568 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gxnxl"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.816777 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.117685905 podStartE2EDuration="16.81675353s" podCreationTimestamp="2026-02-26 11:30:03 +0000 UTC" firstStartedPulling="2026-02-26 11:30:10.585249989 +0000 UTC m=+1156.396076423" lastFinishedPulling="2026-02-26 11:30:19.284317614 +0000 UTC m=+1165.095144048" observedRunningTime="2026-02-26 11:30:19.810293784 +0000 UTC m=+1165.621120218" watchObservedRunningTime="2026-02-26 11:30:19.81675353 +0000 UTC m=+1165.627579964"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.838954 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gxnxl" podStartSLOduration=13.547131424 podStartE2EDuration="19.838929791s" podCreationTimestamp="2026-02-26 11:30:00 +0000 UTC" firstStartedPulling="2026-02-26 11:30:10.589263572 +0000 UTC m=+1156.400090006" lastFinishedPulling="2026-02-26 11:30:16.881061949 +0000 UTC m=+1162.691888373" observedRunningTime="2026-02-26 11:30:19.837250902 +0000 UTC m=+1165.648077346" watchObservedRunningTime="2026-02-26 11:30:19.838929791 +0000 UTC m=+1165.649756235"
Feb 26 11:30:19 crc kubenswrapper[4699]: I0226 11:30:19.867262 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.554973604 podStartE2EDuration="16.867240158s" podCreationTimestamp="2026-02-26 11:30:03 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.978433654 +0000 UTC m=+1155.789260088" lastFinishedPulling="2026-02-26 11:30:19.290700208 +0000 UTC m=+1165.101526642" observedRunningTime="2026-02-26 11:30:19.861687658 +0000 UTC m=+1165.672514102" watchObservedRunningTime="2026-02-26 11:30:19.867240158 +0000 UTC m=+1165.678066592"
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.048070 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.048242 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.088981 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.213422 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535084-h8xlt"]
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.219009 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535084-h8xlt"]
Feb 26 11:30:20 crc kubenswrapper[4699]: I0226 11:30:20.269973 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d6d072-33c5-4660-b6c3-80344c215e6a" path="/var/lib/kubelet/pods/98d6d072-33c5-4660-b6c3-80344c215e6a/volumes"
Feb 26 11:30:21 crc kubenswrapper[4699]: I0226 11:30:21.224910 4699 scope.go:117] "RemoveContainer" containerID="dd76d54940753753e3f7a2683a8c241e99cd1928bc9d5ed547595d83c46f6f57"
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.785327 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.822266 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.823840 4699 generic.go:334] "Generic (PLEG): container finished" podID="6fdc6b6d-ac77-4179-9864-f220d622c0f4" containerID="bc8cfe4cbc14669a7d23e320ce251a249775b1b75e5ecb548ae10249e750c023" exitCode=0
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.823895 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6fdc6b6d-ac77-4179-9864-f220d622c0f4","Type":"ContainerDied","Data":"bc8cfe4cbc14669a7d23e320ce251a249775b1b75e5ecb548ae10249e750c023"}
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.824445 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 26 11:30:22 crc kubenswrapper[4699]: I0226 11:30:22.870521 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.117618 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"]
Feb 26 11:30:23 crc kubenswrapper[4699]: E0226 11:30:23.118349 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="dnsmasq-dns"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118370 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="dnsmasq-dns"
Feb 26 11:30:23 crc kubenswrapper[4699]: E0226 11:30:23.118409 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="init"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118418 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="init"
Feb 26 11:30:23 crc kubenswrapper[4699]: E0226 11:30:23.118441 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44038b95-eefd-44cd-9781-0a2273605e75" containerName="init"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118448 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="44038b95-eefd-44cd-9781-0a2273605e75" containerName="init"
Feb 26 11:30:23 crc kubenswrapper[4699]: E0226 11:30:23.118474 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b298a96-eca9-49eb-a547-f88e986f326e" containerName="collect-profiles"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118481 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b298a96-eca9-49eb-a547-f88e986f326e" containerName="collect-profiles"
Feb 26 11:30:23 crc kubenswrapper[4699]: E0226 11:30:23.118492 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d38a99-b56f-423c-9c5b-c8f726bf62f9" containerName="oc"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118500 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d38a99-b56f-423c-9c5b-c8f726bf62f9" containerName="oc"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118695 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="44038b95-eefd-44cd-9781-0a2273605e75" containerName="init"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118713 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d38a99-b56f-423c-9c5b-c8f726bf62f9" containerName="oc"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118724 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b298a96-eca9-49eb-a547-f88e986f326e" containerName="collect-profiles"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.118734 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e16e518-0512-4df0-b8c7-1cd2f9c1e352" containerName="dnsmasq-dns"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.119871 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.122213 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.130595 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"]
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.176013 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-qfxsz"]
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.177328 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.181236 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.200406 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qfxsz"]
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283139 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-combined-ca-bundle\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283436 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4767003-9eba-4b86-933c-5bcbaa93e458-config\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283528 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovn-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283614 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9zb\" (UniqueName: \"kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283679 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283787 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovs-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283839 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9pzj\" (UniqueName: \"kubernetes.io/projected/a4767003-9eba-4b86-933c-5bcbaa93e458-kube-api-access-g9pzj\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz"
Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.283944 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID:
\"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.284019 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.386762 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4767003-9eba-4b86-933c-5bcbaa93e458-config\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.386835 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovn-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.386912 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9zb\" (UniqueName: \"kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.386964 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387012 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovs-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387064 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9pzj\" (UniqueName: \"kubernetes.io/projected/a4767003-9eba-4b86-933c-5bcbaa93e458-kube-api-access-g9pzj\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387163 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387216 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387263 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-combined-ca-bundle\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc 
kubenswrapper[4699]: I0226 11:30:23.387279 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovn-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387279 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4767003-9eba-4b86-933c-5bcbaa93e458-ovs-rundir\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.387365 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.388029 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.388037 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.388499 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.388852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4767003-9eba-4b86-933c-5bcbaa93e458-config\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.392985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.394823 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4767003-9eba-4b86-933c-5bcbaa93e458-combined-ca-bundle\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.406980 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9zb\" (UniqueName: \"kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb\") pod \"dnsmasq-dns-5bf47b49b7-87kkf\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") " pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.413024 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9pzj\" (UniqueName: 
\"kubernetes.io/projected/a4767003-9eba-4b86-933c-5bcbaa93e458-kube-api-access-g9pzj\") pod \"ovn-controller-metrics-qfxsz\" (UID: \"a4767003-9eba-4b86-933c-5bcbaa93e458\") " pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.446464 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.484301 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"] Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.498826 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qfxsz" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.522370 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"] Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.525362 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.529892 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.546485 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"] Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.591165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.591253 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqzz\" (UniqueName: \"kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.591284 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.591339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " 
pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.591386 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.693329 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.693426 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqzz\" (UniqueName: \"kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.693457 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.693517 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 
11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.693560 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.694757 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.694896 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.694920 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.695432 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.714884 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mjqzz\" (UniqueName: \"kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz\") pod \"dnsmasq-dns-8554648995-rx5rp\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") " pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.845732 4699 generic.go:334] "Generic (PLEG): container finished" podID="edce8e75-6dd5-4fbd-8f76-bc6553cc27b9" containerID="bf6fd25fd5c5219234878667bbfc768fc4a7fc9b607b1bbc5dba75d0bb38306a" exitCode=0 Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.846266 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9","Type":"ContainerDied","Data":"bf6fd25fd5c5219234878667bbfc768fc4a7fc9b607b1bbc5dba75d0bb38306a"} Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.849955 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rx5rp" Feb 26 11:30:23 crc kubenswrapper[4699]: I0226 11:30:23.986636 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"] Feb 26 11:30:24 crc kubenswrapper[4699]: I0226 11:30:24.088802 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qfxsz"] Feb 26 11:30:24 crc kubenswrapper[4699]: I0226 11:30:24.358162 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"] Feb 26 11:30:24 crc kubenswrapper[4699]: W0226 11:30:24.359211 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84a51a2d_7b6c_4a4a_849f_7f02bcbaf87a.slice/crio-3c99596d39f423554ef17ae2aa77a54967d1e95b6f6bf8ba86b76cfbeba577ff WatchSource:0}: Error finding container 3c99596d39f423554ef17ae2aa77a54967d1e95b6f6bf8ba86b76cfbeba577ff: Status 404 returned error can't find the container with id 
3c99596d39f423554ef17ae2aa77a54967d1e95b6f6bf8ba86b76cfbeba577ff Feb 26 11:30:24 crc kubenswrapper[4699]: I0226 11:30:24.853589 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" event={"ID":"e0f71319-4adc-48a8-82d1-29a8a6bb7500","Type":"ContainerStarted","Data":"f70b3a342001c7db5b4059fb06e2519c604846c94f573dd9ba11e049e0643348"} Feb 26 11:30:24 crc kubenswrapper[4699]: I0226 11:30:24.854955 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qfxsz" event={"ID":"a4767003-9eba-4b86-933c-5bcbaa93e458","Type":"ContainerStarted","Data":"984cafebc7a4395333e2636a2f692969021197cb222ff4a24278d12bd1a90320"} Feb 26 11:30:24 crc kubenswrapper[4699]: I0226 11:30:24.857247 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rx5rp" event={"ID":"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a","Type":"ContainerStarted","Data":"3c99596d39f423554ef17ae2aa77a54967d1e95b6f6bf8ba86b76cfbeba577ff"} Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.083431 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.183944 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.223833 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.225315 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.238215 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.238269 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-plkfw" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.238519 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.238717 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.241261 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.322965 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323051 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323072 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-config\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " 
pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323104 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-scripts\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323285 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rrl\" (UniqueName: \"kubernetes.io/projected/8fbd47d6-02c1-4ac4-a981-231eb0f13530-kube-api-access-r4rrl\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323314 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.323371 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425264 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425311 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-config\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425344 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-scripts\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rrl\" (UniqueName: \"kubernetes.io/projected/8fbd47d6-02c1-4ac4-a981-231eb0f13530-kube-api-access-r4rrl\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425388 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0" Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.425552 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.426345 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.426373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-config\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.426483 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbd47d6-02c1-4ac4-a981-231eb0f13530-scripts\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.430833 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.430971 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.434740 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbd47d6-02c1-4ac4-a981-231eb0f13530-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.455978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rrl\" (UniqueName: \"kubernetes.io/projected/8fbd47d6-02c1-4ac4-a981-231eb0f13530-kube-api-access-r4rrl\") pod \"ovn-northd-0\" (UID: \"8fbd47d6-02c1-4ac4-a981-231eb0f13530\") " pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.554907 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 26 11:30:25 crc kubenswrapper[4699]: I0226 11:30:25.975982 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 26 11:30:25 crc kubenswrapper[4699]: W0226 11:30:25.976399 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbd47d6_02c1_4ac4_a981_231eb0f13530.slice/crio-bb93ad59326b9fe455fd100fa7d74a387c24b573a58b8705cf87c3955f4720de WatchSource:0}: Error finding container bb93ad59326b9fe455fd100fa7d74a387c24b573a58b8705cf87c3955f4720de: Status 404 returned error can't find the container with id bb93ad59326b9fe455fd100fa7d74a387c24b573a58b8705cf87c3955f4720de
Feb 26 11:30:26 crc kubenswrapper[4699]: I0226 11:30:26.870333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbd47d6-02c1-4ac4-a981-231eb0f13530","Type":"ContainerStarted","Data":"bb93ad59326b9fe455fd100fa7d74a387c24b573a58b8705cf87c3955f4720de"}
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.717331 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.804451 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"]
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.829625 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"]
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.830971 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.850214 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"]
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.973355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.973527 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.973596 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.973660 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:27 crc kubenswrapper[4699]: I0226 11:30:27.973729 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjrg\" (UniqueName: \"kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.075534 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.075596 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.075666 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjrg\" (UniqueName: \"kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.075753 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.075777 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.076886 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.076893 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.077049 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.077049 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.106291 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjrg\" (UniqueName: \"kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg\") pod \"dnsmasq-dns-b8fbc5445-6nf48\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.149823 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.616919 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"]
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.885332 4699 generic.go:334] "Generic (PLEG): container finished" podID="84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" containerID="24d9e2bdd993f65b648a62785a8a9bb52bde1911788d2bc3af7f542b158faa63" exitCode=0
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.885392 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rx5rp" event={"ID":"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a","Type":"ContainerDied","Data":"24d9e2bdd993f65b648a62785a8a9bb52bde1911788d2bc3af7f542b158faa63"}
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.887775 4699 generic.go:334] "Generic (PLEG): container finished" podID="e0f71319-4adc-48a8-82d1-29a8a6bb7500" containerID="445ab4d3ee3c89f4634bcfb0a33d6ea9b7825c4b93d1fd1727ebec918c7cc6e0" exitCode=0
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.887848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" event={"ID":"e0f71319-4adc-48a8-82d1-29a8a6bb7500","Type":"ContainerDied","Data":"445ab4d3ee3c89f4634bcfb0a33d6ea9b7825c4b93d1fd1727ebec918c7cc6e0"}
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.890806 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6fdc6b6d-ac77-4179-9864-f220d622c0f4","Type":"ContainerStarted","Data":"2b3636f054dda5285e4c35de5b8f9641752e1f9f5af5a4146b6d4cb34172fda2"}
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.894255 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"edce8e75-6dd5-4fbd-8f76-bc6553cc27b9","Type":"ContainerStarted","Data":"787e4940aeb2a74392a4a5643cd807e47e393353cd12ad5bb452113b610b3397"}
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.896041 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qfxsz" event={"ID":"a4767003-9eba-4b86-933c-5bcbaa93e458","Type":"ContainerStarted","Data":"f01ac35f9fdf0369d52cce7cc5e603e07f36c304c89816ec522cd67b395bcec5"}
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.945606 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.955902422 podStartE2EDuration="35.945571917s" podCreationTimestamp="2026-02-26 11:29:53 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.900597874 +0000 UTC m=+1155.711424308" lastFinishedPulling="2026-02-26 11:30:16.890267379 +0000 UTC m=+1162.701093803" observedRunningTime="2026-02-26 11:30:28.944136715 +0000 UTC m=+1174.754963159" watchObservedRunningTime="2026-02-26 11:30:28.945571917 +0000 UTC m=+1174.756398351"
Feb 26 11:30:28 crc kubenswrapper[4699]: I0226 11:30:28.968963 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-qfxsz" podStartSLOduration=5.968933252 podStartE2EDuration="5.968933252s" podCreationTimestamp="2026-02-26 11:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:30:28.959299543 +0000 UTC m=+1174.770125977" watchObservedRunningTime="2026-02-26 11:30:28.968933252 +0000 UTC m=+1174.779759686"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.017623 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.024971 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.029650 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.029696 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.029934 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-z4964"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.029650 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.035402 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.363270932 podStartE2EDuration="37.03537153s" podCreationTimestamp="2026-02-26 11:29:52 +0000 UTC" firstStartedPulling="2026-02-26 11:30:09.208959551 +0000 UTC m=+1155.019785985" lastFinishedPulling="2026-02-26 11:30:16.881060149 +0000 UTC m=+1162.691886583" observedRunningTime="2026-02-26 11:30:29.017595357 +0000 UTC m=+1174.828421801" watchObservedRunningTime="2026-02-26 11:30:29.03537153 +0000 UTC m=+1174.846197964"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.058673 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100726 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100815 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100862 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-lock\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100888 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100940 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-cache\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.100961 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75z7t\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-kube-api-access-75z7t\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208388 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208466 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-lock\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208497 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208558 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-cache\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208582 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75z7t\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-kube-api-access-75z7t\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.208633 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.208819 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.208837 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.208890 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift podName:f23ec57b-7ab1-4152-8108-e0e27b4ba95c nodeName:}" failed. No retries permitted until 2026-02-26 11:30:29.708868601 +0000 UTC m=+1175.519695035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift") pod "swift-storage-0" (UID: "f23ec57b-7ab1-4152-8108-e0e27b4ba95c") : configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.209970 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-cache\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.210147 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-lock\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.210209 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.216748 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.242747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75z7t\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-kube-api-access-75z7t\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.254713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.346488 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rx5rp"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.410897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc\") pod \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.411862 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config\") pod \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.411928 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqzz\" (UniqueName: \"kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz\") pod \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.412097 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb\") pod \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.412407 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb\") pod \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\" (UID: \"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.417856 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz" (OuterVolumeSpecName: "kube-api-access-mjqzz") pod "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" (UID: "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a"). InnerVolumeSpecName "kube-api-access-mjqzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.436430 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" (UID: "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.436923 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" (UID: "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.439528 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.440516 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config" (OuterVolumeSpecName: "config") pod "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" (UID: "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.444760 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" (UID: "84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.514576 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc\") pod \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.514871 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb\") pod \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515135 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config\") pod \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515269 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n9zb\" (UniqueName: \"kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb\") pod \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\" (UID: \"e0f71319-4adc-48a8-82d1-29a8a6bb7500\") "
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515754 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515834 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515902 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.515964 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.516025 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqzz\" (UniqueName: \"kubernetes.io/projected/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a-kube-api-access-mjqzz\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.518629 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb" (OuterVolumeSpecName: "kube-api-access-8n9zb") pod "e0f71319-4adc-48a8-82d1-29a8a6bb7500" (UID: "e0f71319-4adc-48a8-82d1-29a8a6bb7500"). InnerVolumeSpecName "kube-api-access-8n9zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.532047 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0f71319-4adc-48a8-82d1-29a8a6bb7500" (UID: "e0f71319-4adc-48a8-82d1-29a8a6bb7500"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.532248 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config" (OuterVolumeSpecName: "config") pod "e0f71319-4adc-48a8-82d1-29a8a6bb7500" (UID: "e0f71319-4adc-48a8-82d1-29a8a6bb7500"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.532876 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0f71319-4adc-48a8-82d1-29a8a6bb7500" (UID: "e0f71319-4adc-48a8-82d1-29a8a6bb7500"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.617674 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.617708 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.617722 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f71319-4adc-48a8-82d1-29a8a6bb7500-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.617733 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n9zb\" (UniqueName: \"kubernetes.io/projected/e0f71319-4adc-48a8-82d1-29a8a6bb7500-kube-api-access-8n9zb\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.719648 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.721012 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.721043 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: E0226 11:30:29.721247 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift podName:f23ec57b-7ab1-4152-8108-e0e27b4ba95c nodeName:}" failed. No retries permitted until 2026-02-26 11:30:30.721081902 +0000 UTC m=+1176.531908336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift") pod "swift-storage-0" (UID: "f23ec57b-7ab1-4152-8108-e0e27b4ba95c") : configmap "swift-ring-files" not found
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.908857 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf" event={"ID":"e0f71319-4adc-48a8-82d1-29a8a6bb7500","Type":"ContainerDied","Data":"f70b3a342001c7db5b4059fb06e2519c604846c94f573dd9ba11e049e0643348"}
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.908908 4699 scope.go:117] "RemoveContainer" containerID="445ab4d3ee3c89f4634bcfb0a33d6ea9b7825c4b93d1fd1727ebec918c7cc6e0"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.909032 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-87kkf"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.916549 4699 generic.go:334] "Generic (PLEG): container finished" podID="2a166832-199a-436c-85a2-4ccde527f180" containerID="4ad9a83fa9f5197d955a8f1565b66571572dedbb333404d507411352c78978c6" exitCode=0
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.916680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" event={"ID":"2a166832-199a-436c-85a2-4ccde527f180","Type":"ContainerDied","Data":"4ad9a83fa9f5197d955a8f1565b66571572dedbb333404d507411352c78978c6"}
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.916716 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" event={"ID":"2a166832-199a-436c-85a2-4ccde527f180","Type":"ContainerStarted","Data":"e37733ce4b3de5c1e636da1d778df1b2746e600646623b6c23cb5510f0a9db33"}
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.924151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbd47d6-02c1-4ac4-a981-231eb0f13530","Type":"ContainerStarted","Data":"931d0ab71ab277a245269ac933b40aa1db31817a206320265f641db79ee1b41b"}
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.935147 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rx5rp"
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.935205 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rx5rp" event={"ID":"84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a","Type":"ContainerDied","Data":"3c99596d39f423554ef17ae2aa77a54967d1e95b6f6bf8ba86b76cfbeba577ff"}
Feb 26 11:30:29 crc kubenswrapper[4699]: I0226 11:30:29.972034 4699 scope.go:117] "RemoveContainer" containerID="24d9e2bdd993f65b648a62785a8a9bb52bde1911788d2bc3af7f542b158faa63"
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.023329 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"]
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.035641 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-87kkf"]
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.058518 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"]
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.063808 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rx5rp"]
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.269707 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" path="/var/lib/kubelet/pods/84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a/volumes"
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.270253 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f71319-4adc-48a8-82d1-29a8a6bb7500" path="/var/lib/kubelet/pods/e0f71319-4adc-48a8-82d1-29a8a6bb7500/volumes"
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.737025 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0"
Feb 26 11:30:30 crc kubenswrapper[4699]: E0226 11:30:30.737218 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 26 11:30:30 crc kubenswrapper[4699]: E0226 11:30:30.737548 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 26 11:30:30 crc kubenswrapper[4699]: E0226 11:30:30.737613 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift podName:f23ec57b-7ab1-4152-8108-e0e27b4ba95c nodeName:}" failed. No retries permitted until 2026-02-26 11:30:32.737591787 +0000 UTC m=+1178.548418221 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift") pod "swift-storage-0" (UID: "f23ec57b-7ab1-4152-8108-e0e27b4ba95c") : configmap "swift-ring-files" not found
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.943497 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" event={"ID":"2a166832-199a-436c-85a2-4ccde527f180","Type":"ContainerStarted","Data":"a6963bcffe5d258cd49c8f7db7cd1ef0c3f71763a18cb9d09e9e8a608d9fa6bd"}
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.944537 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48"
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.946248 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbd47d6-02c1-4ac4-a981-231eb0f13530","Type":"ContainerStarted","Data":"31fcfd1e702ad2c30a0ec2023dd323706787df36e45990f279928b76e2e809f0"}
Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.946394 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openstack/ovn-northd-0" Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.968998 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" podStartSLOduration=3.968983059 podStartE2EDuration="3.968983059s" podCreationTimestamp="2026-02-26 11:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:30:30.963155161 +0000 UTC m=+1176.773981615" watchObservedRunningTime="2026-02-26 11:30:30.968983059 +0000 UTC m=+1176.779809493" Feb 26 11:30:30 crc kubenswrapper[4699]: I0226 11:30:30.978140 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.230932742 podStartE2EDuration="5.978124554s" podCreationTimestamp="2026-02-26 11:30:25 +0000 UTC" firstStartedPulling="2026-02-26 11:30:25.978676248 +0000 UTC m=+1171.789502682" lastFinishedPulling="2026-02-26 11:30:29.72586806 +0000 UTC m=+1175.536694494" observedRunningTime="2026-02-26 11:30:30.977382352 +0000 UTC m=+1176.788208786" watchObservedRunningTime="2026-02-26 11:30:30.978124554 +0000 UTC m=+1176.788950988" Feb 26 11:30:31 crc kubenswrapper[4699]: E0226 11:30:31.749242 4699 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:47266->38.102.83.213:34509: write tcp 38.102.83.213:47266->38.102.83.213:34509: write: broken pipe Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.772621 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0" Feb 26 11:30:32 crc kubenswrapper[4699]: E0226 11:30:32.772835 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not 
found Feb 26 11:30:32 crc kubenswrapper[4699]: E0226 11:30:32.772881 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 11:30:32 crc kubenswrapper[4699]: E0226 11:30:32.772960 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift podName:f23ec57b-7ab1-4152-8108-e0e27b4ba95c nodeName:}" failed. No retries permitted until 2026-02-26 11:30:36.772937635 +0000 UTC m=+1182.583764069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift") pod "swift-storage-0" (UID: "f23ec57b-7ab1-4152-8108-e0e27b4ba95c") : configmap "swift-ring-files" not found Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.894183 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lqqdx"] Feb 26 11:30:32 crc kubenswrapper[4699]: E0226 11:30:32.894526 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f71319-4adc-48a8-82d1-29a8a6bb7500" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.894543 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f71319-4adc-48a8-82d1-29a8a6bb7500" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: E0226 11:30:32.894565 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.894571 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.894728 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a51a2d-7b6c-4a4a-849f-7f02bcbaf87a" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 
11:30:32.894743 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f71319-4adc-48a8-82d1-29a8a6bb7500" containerName="init" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.895265 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.897609 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.897626 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.905474 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.913137 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lqqdx"] Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976001 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976058 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976148 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6d5r\" (UniqueName: 
\"kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976181 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976234 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:32 crc kubenswrapper[4699]: I0226 11:30:32.976254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077638 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077730 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077763 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077805 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077902 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.077956 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf\") pod 
\"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.078055 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6d5r\" (UniqueName: \"kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.078554 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.078846 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.079034 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.088360 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " 
pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.088496 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.088600 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.111871 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6d5r\" (UniqueName: \"kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r\") pod \"swift-ring-rebalance-lqqdx\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") " pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.212151 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lqqdx" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.638850 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lqqdx"] Feb 26 11:30:33 crc kubenswrapper[4699]: W0226 11:30:33.643705 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9125ee3a_a0b6_469b_b79d_3a376f2d5d91.slice/crio-a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7 WatchSource:0}: Error finding container a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7: Status 404 returned error can't find the container with id a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7 Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.973206 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lqqdx" event={"ID":"9125ee3a-a0b6-469b-b79d-3a376f2d5d91","Type":"ContainerStarted","Data":"a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7"} Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.982541 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 11:30:33 crc kubenswrapper[4699]: I0226 11:30:33.983902 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 11:30:35 crc kubenswrapper[4699]: I0226 11:30:35.123687 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 11:30:35 crc kubenswrapper[4699]: I0226 11:30:35.124955 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 11:30:35 crc kubenswrapper[4699]: I0226 11:30:35.193735 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 11:30:36 crc kubenswrapper[4699]: 
I0226 11:30:36.114021 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.285217 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.393479 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.579008 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8e68-account-create-update-bwkx8"] Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.580226 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.582613 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.591050 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8e68-account-create-update-bwkx8"] Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.648266 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.648422 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55vn\" (UniqueName: \"kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: 
\"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.676824 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-29gg4"] Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.677989 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.684071 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-29gg4"] Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.751108 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55vn\" (UniqueName: \"kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.751418 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.751511 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz95n\" (UniqueName: \"kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.751611 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.752300 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.790164 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55vn\" (UniqueName: \"kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn\") pod \"placement-8e68-account-create-update-bwkx8\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.853621 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.853768 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.853845 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kz95n\" (UniqueName: \"kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: E0226 11:30:36.854020 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 11:30:36 crc kubenswrapper[4699]: E0226 11:30:36.854049 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 11:30:36 crc kubenswrapper[4699]: E0226 11:30:36.854101 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift podName:f23ec57b-7ab1-4152-8108-e0e27b4ba95c nodeName:}" failed. No retries permitted until 2026-02-26 11:30:44.854084312 +0000 UTC m=+1190.664910746 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift") pod "swift-storage-0" (UID: "f23ec57b-7ab1-4152-8108-e0e27b4ba95c") : configmap "swift-ring-files" not found Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.855025 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.889700 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz95n\" (UniqueName: \"kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n\") pod \"placement-db-create-29gg4\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " pod="openstack/placement-db-create-29gg4" Feb 26 11:30:36 crc kubenswrapper[4699]: I0226 11:30:36.900486 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:37 crc kubenswrapper[4699]: I0226 11:30:37.000152 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-29gg4" Feb 26 11:30:37 crc kubenswrapper[4699]: W0226 11:30:37.499392 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e3aace_8f02_410d_8e7e_4fa61336435b.slice/crio-b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3 WatchSource:0}: Error finding container b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3: Status 404 returned error can't find the container with id b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3 Feb 26 11:30:37 crc kubenswrapper[4699]: I0226 11:30:37.500581 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8e68-account-create-update-bwkx8"] Feb 26 11:30:37 crc kubenswrapper[4699]: I0226 11:30:37.589702 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-29gg4"] Feb 26 11:30:37 crc kubenswrapper[4699]: W0226 11:30:37.591571 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9392947_cd31_4afd_92c7_73bac0d4cbd3.slice/crio-b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7 WatchSource:0}: Error finding container b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7: Status 404 returned error can't find the container with id b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7 Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.003627 4699 generic.go:334] "Generic (PLEG): container finished" podID="f6e3aace-8f02-410d-8e7e-4fa61336435b" containerID="6bf24901f54aea8222e7ac0b7dea606ea0a09d83f0dad7544b8e7bc98249b1e8" exitCode=0 Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.004018 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8e68-account-create-update-bwkx8" 
event={"ID":"f6e3aace-8f02-410d-8e7e-4fa61336435b","Type":"ContainerDied","Data":"6bf24901f54aea8222e7ac0b7dea606ea0a09d83f0dad7544b8e7bc98249b1e8"} Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.004044 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8e68-account-create-update-bwkx8" event={"ID":"f6e3aace-8f02-410d-8e7e-4fa61336435b","Type":"ContainerStarted","Data":"b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3"} Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.005383 4699 generic.go:334] "Generic (PLEG): container finished" podID="e9392947-cd31-4afd-92c7-73bac0d4cbd3" containerID="02517dfaa484539c60d2ef72e32d7a113f0b9a11e109ec31ac01691b7f015d05" exitCode=0 Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.005434 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-29gg4" event={"ID":"e9392947-cd31-4afd-92c7-73bac0d4cbd3","Type":"ContainerDied","Data":"02517dfaa484539c60d2ef72e32d7a113f0b9a11e109ec31ac01691b7f015d05"} Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.005449 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-29gg4" event={"ID":"e9392947-cd31-4afd-92c7-73bac0d4cbd3","Type":"ContainerStarted","Data":"b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7"} Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.006707 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lqqdx" event={"ID":"9125ee3a-a0b6-469b-b79d-3a376f2d5d91","Type":"ContainerStarted","Data":"11bb20834c3902f477ab036d4f74aa6b8faa916aaeb98d82af08a9d084ddec28"} Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.054508 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-lqqdx" podStartSLOduration=2.476964355 podStartE2EDuration="6.054487218s" podCreationTimestamp="2026-02-26 11:30:32 +0000 UTC" firstStartedPulling="2026-02-26 
11:30:33.645806792 +0000 UTC m=+1179.456633226" lastFinishedPulling="2026-02-26 11:30:37.223329655 +0000 UTC m=+1183.034156089" observedRunningTime="2026-02-26 11:30:38.047994941 +0000 UTC m=+1183.858821395" watchObservedRunningTime="2026-02-26 11:30:38.054487218 +0000 UTC m=+1183.865313652" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.152251 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.220793 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"] Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.221046 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="dnsmasq-dns" containerID="cri-o://66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800" gracePeriod=10 Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.703007 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.786983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config\") pod \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.787074 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx227\" (UniqueName: \"kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227\") pod \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.787281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc\") pod \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\" (UID: \"13838b5f-5f0e-44ba-8b63-97b4e20efbce\") " Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.793244 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227" (OuterVolumeSpecName: "kube-api-access-vx227") pod "13838b5f-5f0e-44ba-8b63-97b4e20efbce" (UID: "13838b5f-5f0e-44ba-8b63-97b4e20efbce"). InnerVolumeSpecName "kube-api-access-vx227". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.828274 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config" (OuterVolumeSpecName: "config") pod "13838b5f-5f0e-44ba-8b63-97b4e20efbce" (UID: "13838b5f-5f0e-44ba-8b63-97b4e20efbce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.829292 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13838b5f-5f0e-44ba-8b63-97b4e20efbce" (UID: "13838b5f-5f0e-44ba-8b63-97b4e20efbce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.889668 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx227\" (UniqueName: \"kubernetes.io/projected/13838b5f-5f0e-44ba-8b63-97b4e20efbce-kube-api-access-vx227\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.889720 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:38 crc kubenswrapper[4699]: I0226 11:30:38.889733 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13838b5f-5f0e-44ba-8b63-97b4e20efbce-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.014506 4699 generic.go:334] "Generic (PLEG): container finished" podID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerID="66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800" exitCode=0 Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.014588 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.014612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" event={"ID":"13838b5f-5f0e-44ba-8b63-97b4e20efbce","Type":"ContainerDied","Data":"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800"} Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.014655 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mwnwv" event={"ID":"13838b5f-5f0e-44ba-8b63-97b4e20efbce","Type":"ContainerDied","Data":"30da7116ef227de51d61b599067ff253e3fbcd27cb9bf2e3d4c83d06e5a7374a"} Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.014674 4699 scope.go:117] "RemoveContainer" containerID="66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.057983 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"] Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.072298 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mwnwv"] Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.086908 4699 scope.go:117] "RemoveContainer" containerID="d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.148830 4699 scope.go:117] "RemoveContainer" containerID="66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800" Feb 26 11:30:39 crc kubenswrapper[4699]: E0226 11:30:39.152230 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800\": container with ID starting with 66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800 not found: ID does not exist" 
containerID="66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.152273 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800"} err="failed to get container status \"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800\": rpc error: code = NotFound desc = could not find container \"66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800\": container with ID starting with 66f36dd399631c094ac2a9c045c8b98455818a36d07ce2ecb6290645636b2800 not found: ID does not exist" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.152297 4699 scope.go:117] "RemoveContainer" containerID="d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe" Feb 26 11:30:39 crc kubenswrapper[4699]: E0226 11:30:39.154357 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe\": container with ID starting with d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe not found: ID does not exist" containerID="d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.154396 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe"} err="failed to get container status \"d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe\": rpc error: code = NotFound desc = could not find container \"d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe\": container with ID starting with d9b6954a7b897d0e33bc9da8a8d509e007749e72356e89501274b2dcc58768fe not found: ID does not exist" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.469762 4699 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-29gg4" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.483092 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.602736 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz95n\" (UniqueName: \"kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n\") pod \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.603203 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts\") pod \"f6e3aace-8f02-410d-8e7e-4fa61336435b\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.603234 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts\") pod \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\" (UID: \"e9392947-cd31-4afd-92c7-73bac0d4cbd3\") " Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.603443 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55vn\" (UniqueName: \"kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn\") pod \"f6e3aace-8f02-410d-8e7e-4fa61336435b\" (UID: \"f6e3aace-8f02-410d-8e7e-4fa61336435b\") " Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.603767 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "f6e3aace-8f02-410d-8e7e-4fa61336435b" (UID: "f6e3aace-8f02-410d-8e7e-4fa61336435b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.604150 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9392947-cd31-4afd-92c7-73bac0d4cbd3" (UID: "e9392947-cd31-4afd-92c7-73bac0d4cbd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.607970 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n" (OuterVolumeSpecName: "kube-api-access-kz95n") pod "e9392947-cd31-4afd-92c7-73bac0d4cbd3" (UID: "e9392947-cd31-4afd-92c7-73bac0d4cbd3"). InnerVolumeSpecName "kube-api-access-kz95n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.608012 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn" (OuterVolumeSpecName: "kube-api-access-r55vn") pod "f6e3aace-8f02-410d-8e7e-4fa61336435b" (UID: "f6e3aace-8f02-410d-8e7e-4fa61336435b"). InnerVolumeSpecName "kube-api-access-r55vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.705380 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55vn\" (UniqueName: \"kubernetes.io/projected/f6e3aace-8f02-410d-8e7e-4fa61336435b-kube-api-access-r55vn\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.705409 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz95n\" (UniqueName: \"kubernetes.io/projected/e9392947-cd31-4afd-92c7-73bac0d4cbd3-kube-api-access-kz95n\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.705420 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6e3aace-8f02-410d-8e7e-4fa61336435b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:39 crc kubenswrapper[4699]: I0226 11:30:39.705428 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9392947-cd31-4afd-92c7-73bac0d4cbd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.024014 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8e68-account-create-update-bwkx8" event={"ID":"f6e3aace-8f02-410d-8e7e-4fa61336435b","Type":"ContainerDied","Data":"b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3"} Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.024066 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b596e01a17243fbf4997c4d2c8dfee292b581ac3cf78de67257c3b09a33dffa3" Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.024154 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8e68-account-create-update-bwkx8" Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.032191 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-29gg4" event={"ID":"e9392947-cd31-4afd-92c7-73bac0d4cbd3","Type":"ContainerDied","Data":"b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7"} Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.032230 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b278a38ee63ea38861814367d188a15a9a6283a23e63740d2fe8992f0f5bf2f7" Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.032364 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-29gg4" Feb 26 11:30:40 crc kubenswrapper[4699]: I0226 11:30:40.271368 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" path="/var/lib/kubelet/pods/13838b5f-5f0e-44ba-8b63-97b4e20efbce/volumes" Feb 26 11:30:41 crc kubenswrapper[4699]: I0226 11:30:41.585490 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:30:41 crc kubenswrapper[4699]: I0226 11:30:41.585781 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347182 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-47sx2"] Feb 26 11:30:42 crc 
kubenswrapper[4699]: E0226 11:30:42.347563 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="dnsmasq-dns" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347602 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="dnsmasq-dns" Feb 26 11:30:42 crc kubenswrapper[4699]: E0226 11:30:42.347614 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e3aace-8f02-410d-8e7e-4fa61336435b" containerName="mariadb-account-create-update" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347621 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e3aace-8f02-410d-8e7e-4fa61336435b" containerName="mariadb-account-create-update" Feb 26 11:30:42 crc kubenswrapper[4699]: E0226 11:30:42.347635 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9392947-cd31-4afd-92c7-73bac0d4cbd3" containerName="mariadb-database-create" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347643 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9392947-cd31-4afd-92c7-73bac0d4cbd3" containerName="mariadb-database-create" Feb 26 11:30:42 crc kubenswrapper[4699]: E0226 11:30:42.347668 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="init" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347678 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="init" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347873 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="13838b5f-5f0e-44ba-8b63-97b4e20efbce" containerName="dnsmasq-dns" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347888 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9392947-cd31-4afd-92c7-73bac0d4cbd3" containerName="mariadb-database-create" Feb 26 
11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.347902 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e3aace-8f02-410d-8e7e-4fa61336435b" containerName="mariadb-account-create-update" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.348526 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.351673 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.358995 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-47sx2"] Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.450821 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9stw5\" (UniqueName: \"kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.450969 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.552952 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9stw5\" (UniqueName: \"kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " 
pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.553064 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.553880 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.570155 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9stw5\" (UniqueName: \"kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5\") pod \"root-account-create-update-47sx2\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:42 crc kubenswrapper[4699]: I0226 11:30:42.669868 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:43 crc kubenswrapper[4699]: I0226 11:30:43.034828 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-47sx2"] Feb 26 11:30:43 crc kubenswrapper[4699]: I0226 11:30:43.060933 4699 generic.go:334] "Generic (PLEG): container finished" podID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerID="4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629" exitCode=0 Feb 26 11:30:43 crc kubenswrapper[4699]: I0226 11:30:43.061007 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerDied","Data":"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629"} Feb 26 11:30:43 crc kubenswrapper[4699]: I0226 11:30:43.064868 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerID="01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f" exitCode=0 Feb 26 11:30:43 crc kubenswrapper[4699]: I0226 11:30:43.064921 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerDied","Data":"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f"} Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.073823 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerStarted","Data":"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7"} Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.074405 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.082342 4699 generic.go:334] "Generic (PLEG): container finished" podID="acabaa2a-471d-49a2-9e75-b5c1a8eb590e" 
containerID="3fc8431c0d9189816a6d87bbbf1bde79cfcb29458f69200822c417c75941073b" exitCode=0 Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.082397 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-47sx2" event={"ID":"acabaa2a-471d-49a2-9e75-b5c1a8eb590e","Type":"ContainerDied","Data":"3fc8431c0d9189816a6d87bbbf1bde79cfcb29458f69200822c417c75941073b"} Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.082422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-47sx2" event={"ID":"acabaa2a-471d-49a2-9e75-b5c1a8eb590e","Type":"ContainerStarted","Data":"7a667b210df76a1b9615f469b7a212c9e1abf68463fdab080abb7ecfefcc2f05"} Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.083884 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerStarted","Data":"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625"} Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.084631 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.100396 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=47.336488712 podStartE2EDuration="54.100379684s" podCreationTimestamp="2026-02-26 11:29:50 +0000 UTC" firstStartedPulling="2026-02-26 11:30:02.536307278 +0000 UTC m=+1148.347133712" lastFinishedPulling="2026-02-26 11:30:09.30019825 +0000 UTC m=+1155.111024684" observedRunningTime="2026-02-26 11:30:44.097652515 +0000 UTC m=+1189.908478959" watchObservedRunningTime="2026-02-26 11:30:44.100379684 +0000 UTC m=+1189.911206118" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.147976 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=43.468588836 podStartE2EDuration="54.147956908s" podCreationTimestamp="2026-02-26 11:29:50 +0000 UTC" firstStartedPulling="2026-02-26 11:29:58.583180904 +0000 UTC m=+1144.394007338" lastFinishedPulling="2026-02-26 11:30:09.262548976 +0000 UTC m=+1155.073375410" observedRunningTime="2026-02-26 11:30:44.143556381 +0000 UTC m=+1189.954382835" watchObservedRunningTime="2026-02-26 11:30:44.147956908 +0000 UTC m=+1189.958783352" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.901698 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.907654 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f23ec57b-7ab1-4152-8108-e0e27b4ba95c-etc-swift\") pod \"swift-storage-0\" (UID: \"f23ec57b-7ab1-4152-8108-e0e27b4ba95c\") " pod="openstack/swift-storage-0" Feb 26 11:30:44 crc kubenswrapper[4699]: I0226 11:30:44.954042 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.091645 4699 generic.go:334] "Generic (PLEG): container finished" podID="9125ee3a-a0b6-469b-b79d-3a376f2d5d91" containerID="11bb20834c3902f477ab036d4f74aa6b8faa916aaeb98d82af08a9d084ddec28" exitCode=0 Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.091789 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lqqdx" event={"ID":"9125ee3a-a0b6-469b-b79d-3a376f2d5d91","Type":"ContainerDied","Data":"11bb20834c3902f477ab036d4f74aa6b8faa916aaeb98d82af08a9d084ddec28"} Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.416748 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nhpn8"] Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.418519 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhpn8" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.424267 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nhpn8"] Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.511146 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzxwx\" (UniqueName: \"kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.511221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.526598 4699 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-47sx2" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.549389 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f9e8-account-create-update-zqq4d"] Feb 26 11:30:45 crc kubenswrapper[4699]: E0226 11:30:45.549805 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acabaa2a-471d-49a2-9e75-b5c1a8eb590e" containerName="mariadb-account-create-update" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.549823 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="acabaa2a-471d-49a2-9e75-b5c1a8eb590e" containerName="mariadb-account-create-update" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.550004 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="acabaa2a-471d-49a2-9e75-b5c1a8eb590e" containerName="mariadb-account-create-update" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.550621 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f9e8-account-create-update-zqq4d" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.552956 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.569331 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f9e8-account-create-update-zqq4d"] Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.617212 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9stw5\" (UniqueName: \"kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5\") pod \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.619949 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acabaa2a-471d-49a2-9e75-b5c1a8eb590e" (UID: "acabaa2a-471d-49a2-9e75-b5c1a8eb590e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.618045 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts\") pod \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\" (UID: \"acabaa2a-471d-49a2-9e75-b5c1a8eb590e\") " Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.620350 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.621023 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.621162 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d" Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.623823 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5" (OuterVolumeSpecName: "kube-api-access-9stw5") pod "acabaa2a-471d-49a2-9e75-b5c1a8eb590e" (UID: "acabaa2a-471d-49a2-9e75-b5c1a8eb590e"). 
InnerVolumeSpecName "kube-api-access-9stw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.624673 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7fx\" (UniqueName: \"kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.625302 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzxwx\" (UniqueName: \"kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.625779 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.625794 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9stw5\" (UniqueName: \"kubernetes.io/projected/acabaa2a-471d-49a2-9e75-b5c1a8eb590e-kube-api-access-9stw5\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.628653 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.648077 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzxwx\" (UniqueName: \"kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx\") pod \"glance-db-create-nhpn8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") " pod="openstack/glance-db-create-nhpn8"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.713006 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.729442 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.729495 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7fx\" (UniqueName: \"kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.731063 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.740474 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhpn8"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.750450 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7fx\" (UniqueName: \"kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx\") pod \"glance-f9e8-account-create-update-zqq4d\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") " pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:45 crc kubenswrapper[4699]: I0226 11:30:45.867635 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.102911 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"3586cfc7a59a5c437d5417c4ce50a7a439a961c8aa77b8f836d57d0a464bd67f"}
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.104523 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-47sx2"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.108385 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-47sx2" event={"ID":"acabaa2a-471d-49a2-9e75-b5c1a8eb590e","Type":"ContainerDied","Data":"7a667b210df76a1b9615f469b7a212c9e1abf68463fdab080abb7ecfefcc2f05"}
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.108438 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a667b210df76a1b9615f469b7a212c9e1abf68463fdab080abb7ecfefcc2f05"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.256468 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nhpn8"]
Feb 26 11:30:46 crc kubenswrapper[4699]: W0226 11:30:46.290270 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e74821a_c4e5_4812_829d_c6b60b6657b8.slice/crio-1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811 WatchSource:0}: Error finding container 1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811: Status 404 returned error can't find the container with id 1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.384641 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-htqpz"]
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.387168 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.393699 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-htqpz"]
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.404515 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f9e8-account-create-update-zqq4d"]
Feb 26 11:30:46 crc kubenswrapper[4699]: W0226 11:30:46.405410 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c209748_0c47_4bbb_883b_f4c245b6a156.slice/crio-4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308 WatchSource:0}: Error finding container 4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308: Status 404 returned error can't find the container with id 4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.448258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x8m\" (UniqueName: \"kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.448320 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.472076 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0fa1-account-create-update-l7dhx"]
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.474328 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.479762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.486488 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fa1-account-create-update-l7dhx"]
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.554003 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.554410 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xrmj\" (UniqueName: \"kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.554576 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8m\" (UniqueName: \"kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.554697 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.555569 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.582577 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54x8m\" (UniqueName: \"kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m\") pod \"keystone-db-create-htqpz\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.585904 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lqqdx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.655824 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6d5r\" (UniqueName: \"kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.655938 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.655979 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.656041 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.656543 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.657265 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.657361 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.657450 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf\") pod \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\" (UID: \"9125ee3a-a0b6-469b-b79d-3a376f2d5d91\") "
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.657592 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.658063 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xrmj\" (UniqueName: \"kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.658288 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.658341 4699 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.658366 4699 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.659139 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.660867 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r" (OuterVolumeSpecName: "kube-api-access-k6d5r") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "kube-api-access-k6d5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.668753 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.679607 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xrmj\" (UniqueName: \"kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj\") pod \"keystone-0fa1-account-create-update-l7dhx\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") " pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.680583 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts" (OuterVolumeSpecName: "scripts") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.689965 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.695224 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9125ee3a-a0b6-469b-b79d-3a376f2d5d91" (UID: "9125ee3a-a0b6-469b-b79d-3a376f2d5d91"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.759606 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.759647 4699 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.759657 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6d5r\" (UniqueName: \"kubernetes.io/projected/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-kube-api-access-k6d5r\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.759673 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.759685 4699 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9125ee3a-a0b6-469b-b79d-3a376f2d5d91-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.786062 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-htqpz"
Feb 26 11:30:46 crc kubenswrapper[4699]: I0226 11:30:46.877635 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.113319 4699 generic.go:334] "Generic (PLEG): container finished" podID="9c209748-0c47-4bbb-883b-f4c245b6a156" containerID="c5f501a1150c4caded935575b10f8f9230324616853238eace0db08d01347483" exitCode=0
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.113401 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9e8-account-create-update-zqq4d" event={"ID":"9c209748-0c47-4bbb-883b-f4c245b6a156","Type":"ContainerDied","Data":"c5f501a1150c4caded935575b10f8f9230324616853238eace0db08d01347483"}
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.113716 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9e8-account-create-update-zqq4d" event={"ID":"9c209748-0c47-4bbb-883b-f4c245b6a156","Type":"ContainerStarted","Data":"4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308"}
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.126796 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lqqdx" event={"ID":"9125ee3a-a0b6-469b-b79d-3a376f2d5d91","Type":"ContainerDied","Data":"a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7"}
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.126856 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3441a4af4d4bdfc7161195190910ab39fbc98b45c549fbd1135470c64e160b7"
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.126950 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lqqdx"
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.140325 4699 generic.go:334] "Generic (PLEG): container finished" podID="0e74821a-c4e5-4812-829d-c6b60b6657b8" containerID="e9c4f64540efb8ca94268435547206be7e8a21ea869414c0e0fe3fdc2ad23ae0" exitCode=0
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.140379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhpn8" event={"ID":"0e74821a-c4e5-4812-829d-c6b60b6657b8","Type":"ContainerDied","Data":"e9c4f64540efb8ca94268435547206be7e8a21ea869414c0e0fe3fdc2ad23ae0"}
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.140412 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhpn8" event={"ID":"0e74821a-c4e5-4812-829d-c6b60b6657b8","Type":"ContainerStarted","Data":"1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811"}
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.717438 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fa1-account-create-update-l7dhx"]
Feb 26 11:30:47 crc kubenswrapper[4699]: W0226 11:30:47.722102 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22d08e57_ba28_4614_8b11_2bd1bd4f836f.slice/crio-20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b WatchSource:0}: Error finding container 20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b: Status 404 returned error can't find the container with id 20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b
Feb 26 11:30:47 crc kubenswrapper[4699]: I0226 11:30:47.858824 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-htqpz"]
Feb 26 11:30:47 crc kubenswrapper[4699]: W0226 11:30:47.863220 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64b0134d_d882_4622_86a4_ab8172ee4fb2.slice/crio-ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a WatchSource:0}: Error finding container ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a: Status 404 returned error can't find the container with id ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.150108 4699 generic.go:334] "Generic (PLEG): container finished" podID="22d08e57-ba28-4614-8b11-2bd1bd4f836f" containerID="f56c01ae851446ecb80715a4bf6a848caa81425dc5709a8852bd80e336fdb67f" exitCode=0
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.150216 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fa1-account-create-update-l7dhx" event={"ID":"22d08e57-ba28-4614-8b11-2bd1bd4f836f","Type":"ContainerDied","Data":"f56c01ae851446ecb80715a4bf6a848caa81425dc5709a8852bd80e336fdb67f"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.150251 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fa1-account-create-update-l7dhx" event={"ID":"22d08e57-ba28-4614-8b11-2bd1bd4f836f","Type":"ContainerStarted","Data":"20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.152782 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"1abd5235054d6438660e4cd9e2b87298f012eb8e00a183cdffc64128ea761a95"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.152828 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"25c64821bb164d049d829a26cdc22e4714b6156e57d06d5ef5f8043bc28be1f8"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.152838 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"1b2eb469c18895b34d970e41a0f5223711facf0c299ecd2aec6d5a15360c562f"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.152847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"69d593c1a30e3c31c8ab0e87969a35f50e8b930c935e4fe7853873452f097520"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.155562 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-htqpz" event={"ID":"64b0134d-d882-4622-86a4-ab8172ee4fb2","Type":"ContainerStarted","Data":"9e6e239d14eb5fdc0f0fee3107f485263c4c1938d985d9c817ca4f3885c7de71"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.155588 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-htqpz" event={"ID":"64b0134d-d882-4622-86a4-ab8172ee4fb2","Type":"ContainerStarted","Data":"ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a"}
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.193811 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-htqpz" podStartSLOduration=2.193792905 podStartE2EDuration="2.193792905s" podCreationTimestamp="2026-02-26 11:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:30:48.19223649 +0000 UTC m=+1194.003062914" watchObservedRunningTime="2026-02-26 11:30:48.193792905 +0000 UTC m=+1194.004619339"
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.600280 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.607254 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhpn8"
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.712880 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f7fx\" (UniqueName: \"kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx\") pod \"9c209748-0c47-4bbb-883b-f4c245b6a156\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") "
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.712974 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts\") pod \"0e74821a-c4e5-4812-829d-c6b60b6657b8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") "
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.713798 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e74821a-c4e5-4812-829d-c6b60b6657b8" (UID: "0e74821a-c4e5-4812-829d-c6b60b6657b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.713921 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts\") pod \"9c209748-0c47-4bbb-883b-f4c245b6a156\" (UID: \"9c209748-0c47-4bbb-883b-f4c245b6a156\") "
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.714425 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c209748-0c47-4bbb-883b-f4c245b6a156" (UID: "9c209748-0c47-4bbb-883b-f4c245b6a156"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.714483 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzxwx\" (UniqueName: \"kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx\") pod \"0e74821a-c4e5-4812-829d-c6b60b6657b8\" (UID: \"0e74821a-c4e5-4812-829d-c6b60b6657b8\") "
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.715306 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c209748-0c47-4bbb-883b-f4c245b6a156-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.715331 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e74821a-c4e5-4812-829d-c6b60b6657b8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.718903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx" (OuterVolumeSpecName: "kube-api-access-2f7fx") pod "9c209748-0c47-4bbb-883b-f4c245b6a156" (UID: "9c209748-0c47-4bbb-883b-f4c245b6a156"). InnerVolumeSpecName "kube-api-access-2f7fx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.718967 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx" (OuterVolumeSpecName: "kube-api-access-fzxwx") pod "0e74821a-c4e5-4812-829d-c6b60b6657b8" (UID: "0e74821a-c4e5-4812-829d-c6b60b6657b8"). InnerVolumeSpecName "kube-api-access-fzxwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.770872 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-47sx2"]
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.776778 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-47sx2"]
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.817359 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzxwx\" (UniqueName: \"kubernetes.io/projected/0e74821a-c4e5-4812-829d-c6b60b6657b8-kube-api-access-fzxwx\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:48 crc kubenswrapper[4699]: I0226 11:30:48.817397 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f7fx\" (UniqueName: \"kubernetes.io/projected/9c209748-0c47-4bbb-883b-f4c245b6a156-kube-api-access-2f7fx\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.164525 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhpn8" event={"ID":"0e74821a-c4e5-4812-829d-c6b60b6657b8","Type":"ContainerDied","Data":"1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811"}
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.164570 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0f55984a322052ab8a31e913928c3a3606b3448556d7abf749cfd0762d8811"
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.164677 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhpn8"
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.168399 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9e8-account-create-update-zqq4d" event={"ID":"9c209748-0c47-4bbb-883b-f4c245b6a156","Type":"ContainerDied","Data":"4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308"}
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.168445 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4462b064acce42a7acadd47c88ba90a846e89e85ca3f9f837ec64f18e4763308"
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.168508 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f9e8-account-create-update-zqq4d"
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.184372 4699 generic.go:334] "Generic (PLEG): container finished" podID="64b0134d-d882-4622-86a4-ab8172ee4fb2" containerID="9e6e239d14eb5fdc0f0fee3107f485263c4c1938d985d9c817ca4f3885c7de71" exitCode=0
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.184418 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-htqpz" event={"ID":"64b0134d-d882-4622-86a4-ab8172ee4fb2","Type":"ContainerDied","Data":"9e6e239d14eb5fdc0f0fee3107f485263c4c1938d985d9c817ca4f3885c7de71"}
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.458695 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fa1-account-create-update-l7dhx"
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.528399 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts\") pod \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") "
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.528544 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xrmj\" (UniqueName: \"kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj\") pod \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\" (UID: \"22d08e57-ba28-4614-8b11-2bd1bd4f836f\") "
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.528946 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22d08e57-ba28-4614-8b11-2bd1bd4f836f" (UID: "22d08e57-ba28-4614-8b11-2bd1bd4f836f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.533733 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj" (OuterVolumeSpecName: "kube-api-access-8xrmj") pod "22d08e57-ba28-4614-8b11-2bd1bd4f836f" (UID: "22d08e57-ba28-4614-8b11-2bd1bd4f836f"). InnerVolumeSpecName "kube-api-access-8xrmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.630919 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22d08e57-ba28-4614-8b11-2bd1bd4f836f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:49 crc kubenswrapper[4699]: I0226 11:30:49.631205 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xrmj\" (UniqueName: \"kubernetes.io/projected/22d08e57-ba28-4614-8b11-2bd1bd4f836f-kube-api-access-8xrmj\") on node \"crc\" DevicePath \"\""
Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.191358 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fa1-account-create-update-l7dhx" event={"ID":"22d08e57-ba28-4614-8b11-2bd1bd4f836f","Type":"ContainerDied","Data":"20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b"}
Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.191569 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e4ebfd1d0243f30ce7e375a5ce55aba074451b2b65a840da783bdddd8e987b"
Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.191618 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-0fa1-account-create-update-l7dhx" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.196004 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"902223d261f9e97320c9d132642f54dce273be4a13565ccb3c24bd0da65d0020"} Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.196066 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"edae6eb35b2125381e91e42d3750d4c44d05d7a03eef0a838129137f2b9566b3"} Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.196081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"b5a728b521560aedc5243bcdef0a6a51a51a76fd36b9b40122e6e0396a142810"} Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.196094 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"937eebe4bac690c01308a25068e0989e77f70ee4b7fca45b48883de7fb571197"} Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.272589 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acabaa2a-471d-49a2-9e75-b5c1a8eb590e" path="/var/lib/kubelet/pods/acabaa2a-471d-49a2-9e75-b5c1a8eb590e/volumes" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.558770 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-htqpz" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.605291 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nrvng" podUID="cd4015f0-f1a7-40d7-ae69-089f74a6873d" containerName="ovn-controller" probeResult="failure" output=< Feb 26 11:30:50 crc kubenswrapper[4699]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 11:30:50 crc kubenswrapper[4699]: > Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.637108 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.649943 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54x8m\" (UniqueName: \"kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m\") pod \"64b0134d-d882-4622-86a4-ab8172ee4fb2\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.650000 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts\") pod \"64b0134d-d882-4622-86a4-ab8172ee4fb2\" (UID: \"64b0134d-d882-4622-86a4-ab8172ee4fb2\") " Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.650663 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64b0134d-d882-4622-86a4-ab8172ee4fb2" (UID: "64b0134d-d882-4622-86a4-ab8172ee4fb2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.654577 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m" (OuterVolumeSpecName: "kube-api-access-54x8m") pod "64b0134d-d882-4622-86a4-ab8172ee4fb2" (UID: "64b0134d-d882-4622-86a4-ab8172ee4fb2"). InnerVolumeSpecName "kube-api-access-54x8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.655911 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gxnxl" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.751531 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54x8m\" (UniqueName: \"kubernetes.io/projected/64b0134d-d882-4622-86a4-ab8172ee4fb2-kube-api-access-54x8m\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.751860 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b0134d-d882-4622-86a4-ab8172ee4fb2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753471 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nblvp"] Feb 26 11:30:50 crc kubenswrapper[4699]: E0226 11:30:50.753851 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74821a-c4e5-4812-829d-c6b60b6657b8" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753871 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74821a-c4e5-4812-829d-c6b60b6657b8" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: E0226 11:30:50.753892 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d08e57-ba28-4614-8b11-2bd1bd4f836f" 
containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753900 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d08e57-ba28-4614-8b11-2bd1bd4f836f" containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: E0226 11:30:50.753913 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b0134d-d882-4622-86a4-ab8172ee4fb2" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753920 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b0134d-d882-4622-86a4-ab8172ee4fb2" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: E0226 11:30:50.753945 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9125ee3a-a0b6-469b-b79d-3a376f2d5d91" containerName="swift-ring-rebalance" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753953 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9125ee3a-a0b6-469b-b79d-3a376f2d5d91" containerName="swift-ring-rebalance" Feb 26 11:30:50 crc kubenswrapper[4699]: E0226 11:30:50.753971 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c209748-0c47-4bbb-883b-f4c245b6a156" containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.753979 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c209748-0c47-4bbb-883b-f4c245b6a156" containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754183 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b0134d-d882-4622-86a4-ab8172ee4fb2" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754198 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9125ee3a-a0b6-469b-b79d-3a376f2d5d91" containerName="swift-ring-rebalance" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754207 4699 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0e74821a-c4e5-4812-829d-c6b60b6657b8" containerName="mariadb-database-create" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754219 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d08e57-ba28-4614-8b11-2bd1bd4f836f" containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754229 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c209748-0c47-4bbb-883b-f4c245b6a156" containerName="mariadb-account-create-update" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.754768 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.759024 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.759589 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j4q6c" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.762009 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nblvp"] Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.853627 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.853750 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj5f\" (UniqueName: \"kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f\") pod \"glance-db-sync-nblvp\" (UID: 
\"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.853785 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.853858 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.901069 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nrvng-config-8jbz5"] Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.902219 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.905193 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.919664 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng-config-8jbz5"] Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955666 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955715 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj5f\" (UniqueName: \"kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955751 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955792 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc 
kubenswrapper[4699]: I0226 11:30:50.955840 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955863 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955881 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.955907 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.956026 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczbr\" (UniqueName: \"kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " 
pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.956171 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.969361 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.971410 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.971580 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:50 crc kubenswrapper[4699]: I0226 11:30:50.974576 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj5f\" (UniqueName: \"kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f\") pod \"glance-db-sync-nblvp\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057129 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057494 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057519 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057677 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qczbr\" (UniqueName: \"kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057824 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.057981 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.058230 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.059848 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.076382 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczbr\" (UniqueName: \"kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr\") pod \"ovn-controller-nrvng-config-8jbz5\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.084492 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nblvp" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.222357 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.246466 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-htqpz" event={"ID":"64b0134d-d882-4622-86a4-ab8172ee4fb2","Type":"ContainerDied","Data":"ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a"} Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.246525 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae8f44b486fa15025b9de2e5a645bec5ea7de597824b9c9e5ed6ba992765d39a" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.246523 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-htqpz" Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.655725 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nblvp"] Feb 26 11:30:51 crc kubenswrapper[4699]: W0226 11:30:51.664345 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72c1d656_4f85_483b_b7a2_6132b71ae093.slice/crio-c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e WatchSource:0}: Error finding container c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e: Status 404 returned error can't find the container with id c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e Feb 26 11:30:51 crc kubenswrapper[4699]: I0226 11:30:51.741431 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng-config-8jbz5"] Feb 26 11:30:51 crc kubenswrapper[4699]: W0226 11:30:51.745534 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac9468ff_bc02_4a6e_83b3_0f6a5a8876a1.slice/crio-ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70 WatchSource:0}: Error finding container ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70: Status 404 returned error can't find the container with id ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70 Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.260952 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-8jbz5" event={"ID":"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1","Type":"ContainerStarted","Data":"c6b236ca3c3f327dbd547c137704ae3085c07d33a8a0f68103faaa60a3289bc1"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.261251 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-8jbz5" 
event={"ID":"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1","Type":"ContainerStarted","Data":"ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.275267 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nblvp" event={"ID":"72c1d656-4f85-483b-b7a2-6132b71ae093","Type":"ContainerStarted","Data":"c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.280829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"559c831350972c9fb72080a6ca387d917a3cd3d6a836ca18541ac2393f96585f"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.280869 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"9b80de6c685d283bba6c9ee68dbafa5d881b0a8dafd5adde3ceebf5eb7dace4f"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.280881 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"2e907cbd0611162733bea4638053f70d7b24fa340784ef231bfd03eeac9a5c47"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.280890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"852545b613e8bd35812950cb85024e453bb58b7b731c8a57fcbeeee92353a0e6"} Feb 26 11:30:52 crc kubenswrapper[4699]: I0226 11:30:52.280898 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"0cacbba609d21ec506f709c084156a2064c87e2d9dbdd77a8c579730774c901a"} Feb 26 11:30:52 crc 
kubenswrapper[4699]: I0226 11:30:52.308324 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nrvng-config-8jbz5" podStartSLOduration=2.308299368 podStartE2EDuration="2.308299368s" podCreationTimestamp="2026-02-26 11:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:30:52.300285396 +0000 UTC m=+1198.111111820" watchObservedRunningTime="2026-02-26 11:30:52.308299368 +0000 UTC m=+1198.119125812" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.296144 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"19e6eb1a84328da44fcedd6164617f9e6ed1da2193ac5b1e14579975aee97b6c"} Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.296503 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f23ec57b-7ab1-4152-8108-e0e27b4ba95c","Type":"ContainerStarted","Data":"4598d639014915e11348a97d71f0dd85c79c60c306a6f9b7337450e6d52b6f98"} Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.300943 4699 generic.go:334] "Generic (PLEG): container finished" podID="ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" containerID="c6b236ca3c3f327dbd547c137704ae3085c07d33a8a0f68103faaa60a3289bc1" exitCode=0 Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.300995 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-8jbz5" event={"ID":"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1","Type":"ContainerDied","Data":"c6b236ca3c3f327dbd547c137704ae3085c07d33a8a0f68103faaa60a3289bc1"} Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.382108 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.021940015 podStartE2EDuration="26.382091648s" 
podCreationTimestamp="2026-02-26 11:30:27 +0000 UTC" firstStartedPulling="2026-02-26 11:30:45.712596073 +0000 UTC m=+1191.523422507" lastFinishedPulling="2026-02-26 11:30:51.072747706 +0000 UTC m=+1196.883574140" observedRunningTime="2026-02-26 11:30:53.352983308 +0000 UTC m=+1199.163809762" watchObservedRunningTime="2026-02-26 11:30:53.382091648 +0000 UTC m=+1199.192918082" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.639012 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"] Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.641168 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.643849 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.645458 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"] Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.711528 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.711603 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.711688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.711718 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxfq\" (UniqueName: \"kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.711918 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.712023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.771499 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gjgfc"] Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.774316 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.776320 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.780741 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gjgfc"] Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814073 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814169 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814253 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxfq\" (UniqueName: \"kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc 
kubenswrapper[4699]: I0226 11:30:53.814275 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814322 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.814342 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42gd\" (UniqueName: \"kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.815278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 
crc kubenswrapper[4699]: I0226 11:30:53.815808 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.816344 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.816907 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.818737 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.834325 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxfq\" (UniqueName: \"kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq\") pod \"dnsmasq-dns-6d5b6d6b67-c6zn7\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.916096 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.916198 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42gd\" (UniqueName: \"kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.917357 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.931765 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42gd\" (UniqueName: \"kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd\") pod \"root-account-create-update-gjgfc\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:53 crc kubenswrapper[4699]: I0226 11:30:53.976159 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:54 crc kubenswrapper[4699]: I0226 11:30:54.095966 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gjgfc" Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:54.419597 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"] Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:55.323005 4699 generic.go:334] "Generic (PLEG): container finished" podID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerID="9682f0a3316099cd400015d1d5abe7c7f75f2f43640ff21520a7cddc2ba23260" exitCode=0 Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:55.323379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" event={"ID":"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9","Type":"ContainerDied","Data":"9682f0a3316099cd400015d1d5abe7c7f75f2f43640ff21520a7cddc2ba23260"} Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:55.323414 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" event={"ID":"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9","Type":"ContainerStarted","Data":"db12e6ab7e70b99da81ac4834b205007d7df170db9b9e0a8bd4ab5007bbb10d9"} Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:55.592784 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nrvng" Feb 26 11:30:55 crc kubenswrapper[4699]: I0226 11:30:55.956920 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.037334 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gjgfc"] Feb 26 11:30:56 crc kubenswrapper[4699]: W0226 11:30:56.040334 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c102f5c_cbaf_429e_b487_8b179f989720.slice/crio-bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9 WatchSource:0}: Error finding container bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9: Status 404 returned error can't find the container with id bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9 Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.054993 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055161 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055226 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055272 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qczbr\" (UniqueName: 
\"kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055311 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055361 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn\") pod \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\" (UID: \"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1\") " Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055145 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run" (OuterVolumeSpecName: "var-run") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055194 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.055690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.056076 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.056674 4699 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.056699 4699 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.056714 4699 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.056724 4699 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-var-run\") on node \"crc\" DevicePath \"\"" 
Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.057146 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts" (OuterVolumeSpecName: "scripts") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.061714 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr" (OuterVolumeSpecName: "kube-api-access-qczbr") pod "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" (UID: "ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1"). InnerVolumeSpecName "kube-api-access-qczbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.160774 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.161220 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qczbr\" (UniqueName: \"kubernetes.io/projected/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1-kube-api-access-qczbr\") on node \"crc\" DevicePath \"\"" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.332215 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-8jbz5" event={"ID":"ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1","Type":"ContainerDied","Data":"ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70"} Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.332253 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed20c534aca38673aafb1717e8c45b588dfeeb556cb91c29e830654d960b0a70" Feb 26 11:30:56 crc kubenswrapper[4699]: 
I0226 11:30:56.332231 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-8jbz5" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.333534 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gjgfc" event={"ID":"7c102f5c-cbaf-429e-b487-8b179f989720","Type":"ContainerStarted","Data":"bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9"} Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.335243 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" event={"ID":"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9","Type":"ContainerStarted","Data":"fe976bbefde2fa99a8167c39df0e86003afc4a567d5a020335a060d2c650e894"} Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.336364 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:30:56 crc kubenswrapper[4699]: I0226 11:30:56.357384 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podStartSLOduration=3.357340549 podStartE2EDuration="3.357340549s" podCreationTimestamp="2026-02-26 11:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:30:56.355054333 +0000 UTC m=+1202.165880797" watchObservedRunningTime="2026-02-26 11:30:56.357340549 +0000 UTC m=+1202.168167003" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.041711 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nrvng-config-8jbz5"] Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.048915 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nrvng-config-8jbz5"] Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.195092 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-nrvng-config-gbwh4"] Feb 26 11:30:57 crc kubenswrapper[4699]: E0226 11:30:57.195541 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" containerName="ovn-config" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.195566 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" containerName="ovn-config" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.195787 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" containerName="ovn-config" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.196466 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.198792 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.224726 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng-config-gbwh4"] Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279216 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279274 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbdb\" (UniqueName: \"kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " 
pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279301 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279324 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279477 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.279547 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381348 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: 
\"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381691 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbdb\" (UniqueName: \"kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381713 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381728 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381775 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381797 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: 
\"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381859 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.381998 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.382576 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.383241 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.384313 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " 
pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.399142 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbdb\" (UniqueName: \"kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb\") pod \"ovn-controller-nrvng-config-gbwh4\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:57 crc kubenswrapper[4699]: I0226 11:30:57.511886 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:30:58 crc kubenswrapper[4699]: I0226 11:30:58.270182 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1" path="/var/lib/kubelet/pods/ac9468ff-bc02-4a6e-83b3-0f6a5a8876a1/volumes" Feb 26 11:30:58 crc kubenswrapper[4699]: I0226 11:30:58.355838 4699 generic.go:334] "Generic (PLEG): container finished" podID="7c102f5c-cbaf-429e-b487-8b179f989720" containerID="91516e9d3caed541543b28d1d1f9c624822ee3d8a280a0f3e6e9514175f1fe30" exitCode=0 Feb 26 11:30:58 crc kubenswrapper[4699]: I0226 11:30:58.359977 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gjgfc" event={"ID":"7c102f5c-cbaf-429e-b487-8b179f989720","Type":"ContainerDied","Data":"91516e9d3caed541543b28d1d1f9c624822ee3d8a280a0f3e6e9514175f1fe30"} Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.340313 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.402292 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.658466 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v77r5"] Feb 26 11:31:02 crc 
kubenswrapper[4699]: I0226 11:31:02.659910 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.700605 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v77r5"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.772319 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a7a2-account-create-update-l2mt4"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.773976 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.781165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.781634 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xts5\" (UniqueName: \"kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.785188 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.793689 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a7a2-account-create-update-l2mt4"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.853924 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bl9wp"] Feb 26 
11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.855762 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.881853 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bl9wp"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.886593 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.886935 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.887154 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xts5\" (UniqueName: \"kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.887195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wvk\" (UniqueName: \"kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 
crc kubenswrapper[4699]: I0226 11:31:02.888109 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.920373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xts5\" (UniqueName: \"kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5\") pod \"cinder-db-create-v77r5\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.969964 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4fx8g"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.971172 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.979508 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4fx8g"] Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.988667 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wvk\" (UniqueName: \"kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.988744 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftbnl\" (UniqueName: \"kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl\") pod \"barbican-db-create-bl9wp\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.988845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts\") pod \"barbican-db-create-bl9wp\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.988900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.994227 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:02 crc kubenswrapper[4699]: I0226 11:31:02.999607 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.012248 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wvk\" (UniqueName: \"kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk\") pod \"cinder-a7a2-account-create-update-l2mt4\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.059760 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5a9e-account-create-update-fzhw8"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.061697 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.067465 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.075621 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5a9e-account-create-update-fzhw8"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.092970 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.093311 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts\") pod \"barbican-db-create-bl9wp\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.093354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.093403 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6p4\" (UniqueName: \"kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.093492 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftbnl\" (UniqueName: \"kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl\") pod \"barbican-db-create-bl9wp\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.094160 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts\") pod \"barbican-db-create-bl9wp\" (UID: 
\"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.115286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftbnl\" (UniqueName: \"kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl\") pod \"barbican-db-create-bl9wp\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.154511 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f3b2-account-create-update-xhgnq"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.155540 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.159210 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.171053 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.188796 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f3b2-account-create-update-xhgnq"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.195812 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spwj\" (UniqueName: \"kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.195867 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.195911 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6p4\" (UniqueName: \"kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.196254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.196936 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.220699 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6p4\" (UniqueName: \"kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4\") pod \"neutron-db-create-4fx8g\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.225182 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v9z8k"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.229479 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.232114 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.232292 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.232388 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.232469 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qbntt" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.252255 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v9z8k"] Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.298407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.298607 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.298762 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spwj\" (UniqueName: \"kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.298889 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vh8z\" (UniqueName: \"kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.301534 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: 
I0226 11:31:03.303245 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.318435 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spwj\" (UniqueName: \"kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj\") pod \"barbican-5a9e-account-create-update-fzhw8\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.399183 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.400716 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsndg\" (UniqueName: \"kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.400923 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.401021 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vh8z\" (UniqueName: \"kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc 
kubenswrapper[4699]: I0226 11:31:03.401091 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.401176 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.401974 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.422108 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vh8z\" (UniqueName: \"kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z\") pod \"neutron-f3b2-account-create-update-xhgnq\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.502552 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " 
pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.502600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsndg\" (UniqueName: \"kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.502650 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.503010 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.507484 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.511896 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle\") pod \"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.519081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsndg\" (UniqueName: \"kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg\") pod 
\"keystone-db-sync-v9z8k\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.579108 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:03 crc kubenswrapper[4699]: I0226 11:31:03.978665 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.066710 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"] Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.066990 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="dnsmasq-dns" containerID="cri-o://a6963bcffe5d258cd49c8f7db7cd1ef0c3f71763a18cb9d09e9e8a608d9fa6bd" gracePeriod=10 Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.284716 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gjgfc" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.414041 4699 generic.go:334] "Generic (PLEG): container finished" podID="2a166832-199a-436c-85a2-4ccde527f180" containerID="a6963bcffe5d258cd49c8f7db7cd1ef0c3f71763a18cb9d09e9e8a608d9fa6bd" exitCode=0 Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.414228 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" event={"ID":"2a166832-199a-436c-85a2-4ccde527f180","Type":"ContainerDied","Data":"a6963bcffe5d258cd49c8f7db7cd1ef0c3f71763a18cb9d09e9e8a608d9fa6bd"} Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.422012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gjgfc" event={"ID":"7c102f5c-cbaf-429e-b487-8b179f989720","Type":"ContainerDied","Data":"bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9"} Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.422045 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfe030826ccd0b67fb14d360359010d9b579493ba9a7174535b351cb92366fa9" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.422064 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gjgfc" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.434418 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q42gd\" (UniqueName: \"kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd\") pod \"7c102f5c-cbaf-429e-b487-8b179f989720\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.434510 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts\") pod \"7c102f5c-cbaf-429e-b487-8b179f989720\" (UID: \"7c102f5c-cbaf-429e-b487-8b179f989720\") " Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.435051 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c102f5c-cbaf-429e-b487-8b179f989720" (UID: "7c102f5c-cbaf-429e-b487-8b179f989720"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.435972 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c102f5c-cbaf-429e-b487-8b179f989720-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.454927 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd" (OuterVolumeSpecName: "kube-api-access-q42gd") pod "7c102f5c-cbaf-429e-b487-8b179f989720" (UID: "7c102f5c-cbaf-429e-b487-8b179f989720"). InnerVolumeSpecName "kube-api-access-q42gd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:04 crc kubenswrapper[4699]: I0226 11:31:04.538006 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q42gd\" (UniqueName: \"kubernetes.io/projected/7c102f5c-cbaf-429e-b487-8b179f989720-kube-api-access-q42gd\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.005973 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bl9wp"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.020025 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5a9e-account-create-update-fzhw8"] Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.027120 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c910eba_ce23_4fd9_b08a_54b96fe6a2da.slice/crio-ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f WatchSource:0}: Error finding container ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f: Status 404 returned error can't find the container with id ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.037322 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1029eddb_2336_4ec5_af4a_b8fed82d3d55.slice/crio-681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8 WatchSource:0}: Error finding container 681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8: Status 404 returned error can't find the container with id 681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8 Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.097844 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.206191 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nrvng-config-gbwh4"] Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.220473 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46fd8768_dfd3_4bb1_b7c9_f4d803bf829f.slice/crio-343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad WatchSource:0}: Error finding container 343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad: Status 404 returned error can't find the container with id 343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.251342 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb\") pod \"2a166832-199a-436c-85a2-4ccde527f180\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.251675 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb\") pod \"2a166832-199a-436c-85a2-4ccde527f180\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.251793 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config\") pod \"2a166832-199a-436c-85a2-4ccde527f180\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.251954 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc\") pod \"2a166832-199a-436c-85a2-4ccde527f180\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.252098 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pjrg\" (UniqueName: \"kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg\") pod \"2a166832-199a-436c-85a2-4ccde527f180\" (UID: \"2a166832-199a-436c-85a2-4ccde527f180\") " Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.260836 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg" (OuterVolumeSpecName: "kube-api-access-7pjrg") pod "2a166832-199a-436c-85a2-4ccde527f180" (UID: "2a166832-199a-436c-85a2-4ccde527f180"). InnerVolumeSpecName "kube-api-access-7pjrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.341521 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4fx8g"] Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.342102 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f040612_306e_4ce2_b289_ed5be7bbc9e3.slice/crio-8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6 WatchSource:0}: Error finding container 8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6: Status 404 returned error can't find the container with id 8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6 Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.354012 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pjrg\" (UniqueName: \"kubernetes.io/projected/2a166832-199a-436c-85a2-4ccde527f180-kube-api-access-7pjrg\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 
crc kubenswrapper[4699]: I0226 11:31:05.371878 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v9z8k"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.385704 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a166832-199a-436c-85a2-4ccde527f180" (UID: "2a166832-199a-436c-85a2-4ccde527f180"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.392795 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a166832-199a-436c-85a2-4ccde527f180" (UID: "2a166832-199a-436c-85a2-4ccde527f180"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.399064 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c25243e_b6d9_40f5_9c3b_31947cf74cc9.slice/crio-1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d WatchSource:0}: Error finding container 1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d: Status 404 returned error can't find the container with id 1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d Feb 26 11:31:05 crc kubenswrapper[4699]: W0226 11:31:05.408492 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a68fa18_1c49_4d3d_bc5f_75763944d818.slice/crio-decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc WatchSource:0}: Error finding container decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc: Status 404 
returned error can't find the container with id decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.424036 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a166832-199a-436c-85a2-4ccde527f180" (UID: "2a166832-199a-436c-85a2-4ccde527f180"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.430776 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config" (OuterVolumeSpecName: "config") pod "2a166832-199a-436c-85a2-4ccde527f180" (UID: "2a166832-199a-436c-85a2-4ccde527f180"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.442066 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v77r5"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.455691 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.457505 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.457603 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.457659 4699 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a166832-199a-436c-85a2-4ccde527f180-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.465944 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f3b2-account-create-update-xhgnq"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.470930 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9z8k" event={"ID":"7f040612-306e-4ce2-b289-ed5be7bbc9e3","Type":"ContainerStarted","Data":"8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.473856 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77r5" event={"ID":"758bbe1c-d826-47f7-aff6-54e9fc4ebe63","Type":"ContainerStarted","Data":"9e7dbf7b2fb001aa6400d97b6a5c91f4d444c0f906552e3eb3f37bb196932e99"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.479422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f3b2-account-create-update-xhgnq" event={"ID":"8c25243e-b6d9-40f5-9c3b-31947cf74cc9","Type":"ContainerStarted","Data":"1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.487044 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a7a2-account-create-update-l2mt4"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.488953 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" event={"ID":"2a166832-199a-436c-85a2-4ccde527f180","Type":"ContainerDied","Data":"e37733ce4b3de5c1e636da1d778df1b2746e600646623b6c23cb5510f0a9db33"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.489007 4699 scope.go:117] "RemoveContainer" containerID="a6963bcffe5d258cd49c8f7db7cd1ef0c3f71763a18cb9d09e9e8a608d9fa6bd" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 
11:31:05.489209 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6nf48" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.500615 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nblvp" event={"ID":"72c1d656-4f85-483b-b7a2-6132b71ae093","Type":"ContainerStarted","Data":"99b2baa30a79cd9b1afa4299366118e58d2c6c18512f6454267d08d3b636f3e6"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.506159 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bl9wp" event={"ID":"1029eddb-2336-4ec5-af4a-b8fed82d3d55","Type":"ContainerStarted","Data":"7c9888c6347c41b14207598f1324ae87027fe21cf208ac04db043c3350762dde"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.506201 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bl9wp" event={"ID":"1029eddb-2336-4ec5-af4a-b8fed82d3d55","Type":"ContainerStarted","Data":"681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.510607 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a7a2-account-create-update-l2mt4" event={"ID":"7a68fa18-1c49-4d3d-bc5f-75763944d818","Type":"ContainerStarted","Data":"decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.514719 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-gbwh4" event={"ID":"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f","Type":"ContainerStarted","Data":"343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.523564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a9e-account-create-update-fzhw8" 
event={"ID":"4c910eba-ce23-4fd9-b08a-54b96fe6a2da","Type":"ContainerStarted","Data":"8ac6484a77ece8a11d14d59104b361e660535022ac1b3f3359289cdf598c1ea3"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.523637 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a9e-account-create-update-fzhw8" event={"ID":"4c910eba-ce23-4fd9-b08a-54b96fe6a2da","Type":"ContainerStarted","Data":"ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.532782 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4fx8g" event={"ID":"5c9e36d9-5d53-46d8-a91a-22dc9338ab58","Type":"ContainerStarted","Data":"cc8acab0f309ee31e4dd19e0655a0a1330830aac52464ec3c2b6fa9110dbc2ad"} Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.536104 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nblvp" podStartSLOduration=2.943763278 podStartE2EDuration="15.536078636s" podCreationTimestamp="2026-02-26 11:30:50 +0000 UTC" firstStartedPulling="2026-02-26 11:30:51.667442721 +0000 UTC m=+1197.478269145" lastFinishedPulling="2026-02-26 11:31:04.259758069 +0000 UTC m=+1210.070584503" observedRunningTime="2026-02-26 11:31:05.530478305 +0000 UTC m=+1211.341304739" watchObservedRunningTime="2026-02-26 11:31:05.536078636 +0000 UTC m=+1211.346905080" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.547650 4699 scope.go:117] "RemoveContainer" containerID="4ad9a83fa9f5197d955a8f1565b66571572dedbb333404d507411352c78978c6" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.589285 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.624937 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6nf48"] Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.639525 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-bl9wp" podStartSLOduration=3.6395058430000002 podStartE2EDuration="3.639505843s" podCreationTimestamp="2026-02-26 11:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:05.596670646 +0000 UTC m=+1211.407497080" watchObservedRunningTime="2026-02-26 11:31:05.639505843 +0000 UTC m=+1211.450332277" Feb 26 11:31:05 crc kubenswrapper[4699]: I0226 11:31:05.655781 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-5a9e-account-create-update-fzhw8" podStartSLOduration=2.655754522 podStartE2EDuration="2.655754522s" podCreationTimestamp="2026-02-26 11:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:05.614343837 +0000 UTC m=+1211.425170291" watchObservedRunningTime="2026-02-26 11:31:05.655754522 +0000 UTC m=+1211.466580966" Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.277634 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a166832-199a-436c-85a2-4ccde527f180" path="/var/lib/kubelet/pods/2a166832-199a-436c-85a2-4ccde527f180/volumes" Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.541860 4699 generic.go:334] "Generic (PLEG): container finished" podID="8c25243e-b6d9-40f5-9c3b-31947cf74cc9" containerID="d84c1ad7d451293243927fb877d730897ca18c570d340c3870da5a49cf7b4e49" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.541913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f3b2-account-create-update-xhgnq" event={"ID":"8c25243e-b6d9-40f5-9c3b-31947cf74cc9","Type":"ContainerDied","Data":"d84c1ad7d451293243927fb877d730897ca18c570d340c3870da5a49cf7b4e49"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.544785 4699 
generic.go:334] "Generic (PLEG): container finished" podID="7a68fa18-1c49-4d3d-bc5f-75763944d818" containerID="0d9733430c4e718e7aff62771d81bae98ffdfc65e518351b1e877ae065bfd725" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.544847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a7a2-account-create-update-l2mt4" event={"ID":"7a68fa18-1c49-4d3d-bc5f-75763944d818","Type":"ContainerDied","Data":"0d9733430c4e718e7aff62771d81bae98ffdfc65e518351b1e877ae065bfd725"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.549106 4699 generic.go:334] "Generic (PLEG): container finished" podID="46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" containerID="dad7fa90e67d3f965c26f7c4abb45503a74b01c5861c388e8b2b6571901121e5" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.549155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-gbwh4" event={"ID":"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f","Type":"ContainerDied","Data":"dad7fa90e67d3f965c26f7c4abb45503a74b01c5861c388e8b2b6571901121e5"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.551196 4699 generic.go:334] "Generic (PLEG): container finished" podID="4c910eba-ce23-4fd9-b08a-54b96fe6a2da" containerID="8ac6484a77ece8a11d14d59104b361e660535022ac1b3f3359289cdf598c1ea3" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.551228 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a9e-account-create-update-fzhw8" event={"ID":"4c910eba-ce23-4fd9-b08a-54b96fe6a2da","Type":"ContainerDied","Data":"8ac6484a77ece8a11d14d59104b361e660535022ac1b3f3359289cdf598c1ea3"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.553347 4699 generic.go:334] "Generic (PLEG): container finished" podID="5c9e36d9-5d53-46d8-a91a-22dc9338ab58" containerID="f9bc95d14d4ca0f4150bed4b727cc55b90093e4c3307ebc23256f5bd6248badb" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.553400 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4fx8g" event={"ID":"5c9e36d9-5d53-46d8-a91a-22dc9338ab58","Type":"ContainerDied","Data":"f9bc95d14d4ca0f4150bed4b727cc55b90093e4c3307ebc23256f5bd6248badb"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.557327 4699 generic.go:334] "Generic (PLEG): container finished" podID="1029eddb-2336-4ec5-af4a-b8fed82d3d55" containerID="7c9888c6347c41b14207598f1324ae87027fe21cf208ac04db043c3350762dde" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.557402 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bl9wp" event={"ID":"1029eddb-2336-4ec5-af4a-b8fed82d3d55","Type":"ContainerDied","Data":"7c9888c6347c41b14207598f1324ae87027fe21cf208ac04db043c3350762dde"} Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.558639 4699 generic.go:334] "Generic (PLEG): container finished" podID="758bbe1c-d826-47f7-aff6-54e9fc4ebe63" containerID="6a7d35b314cb71b7aea626b804eac24b58050ec797d6079e6362282e3f1a7a28" exitCode=0 Feb 26 11:31:06 crc kubenswrapper[4699]: I0226 11:31:06.559469 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77r5" event={"ID":"758bbe1c-d826-47f7-aff6-54e9fc4ebe63","Type":"ContainerDied","Data":"6a7d35b314cb71b7aea626b804eac24b58050ec797d6079e6362282e3f1a7a28"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.028052 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.076213 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.103675 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.106738 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.136620 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.150011 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts\") pod \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.150050 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xts5\" (UniqueName: \"kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5\") pod \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\" (UID: \"758bbe1c-d826-47f7-aff6-54e9fc4ebe63\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.150828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "758bbe1c-d826-47f7-aff6-54e9fc4ebe63" (UID: "758bbe1c-d826-47f7-aff6-54e9fc4ebe63"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.152419 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.159903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5" (OuterVolumeSpecName: "kube-api-access-5xts5") pod "758bbe1c-d826-47f7-aff6-54e9fc4ebe63" (UID: "758bbe1c-d826-47f7-aff6-54e9fc4ebe63"). InnerVolumeSpecName "kube-api-access-5xts5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.210803 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.252966 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253095 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253146 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253181 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8wvk\" (UniqueName: 
\"kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk\") pod \"7a68fa18-1c49-4d3d-bc5f-75763944d818\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253260 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts\") pod \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253288 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vh8z\" (UniqueName: \"kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z\") pod \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253322 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6p4\" (UniqueName: \"kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4\") pod \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253350 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts\") pod \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\" (UID: \"5c9e36d9-5d53-46d8-a91a-22dc9338ab58\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253386 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts\") pod \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\" (UID: \"8c25243e-b6d9-40f5-9c3b-31947cf74cc9\") " Feb 26 
11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253413 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253432 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts\") pod \"7a68fa18-1c49-4d3d-bc5f-75763944d818\" (UID: \"7a68fa18-1c49-4d3d-bc5f-75763944d818\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253459 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253519 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbdb\" (UniqueName: \"kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb\") pod \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\" (UID: \"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253543 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftbnl\" (UniqueName: \"kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl\") pod \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\" (UID: \"1029eddb-2336-4ec5-af4a-b8fed82d3d55\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.253855 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn" 
(OuterVolumeSpecName: "var-run-ovn") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254029 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254043 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xts5\" (UniqueName: \"kubernetes.io/projected/758bbe1c-d826-47f7-aff6-54e9fc4ebe63-kube-api-access-5xts5\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254055 4699 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254113 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run" (OuterVolumeSpecName: "var-run") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254464 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c9e36d9-5d53-46d8-a91a-22dc9338ab58" (UID: "5c9e36d9-5d53-46d8-a91a-22dc9338ab58"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.254734 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c25243e-b6d9-40f5-9c3b-31947cf74cc9" (UID: "8c25243e-b6d9-40f5-9c3b-31947cf74cc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.255025 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts" (OuterVolumeSpecName: "scripts") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.255038 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1029eddb-2336-4ec5-af4a-b8fed82d3d55" (UID: "1029eddb-2336-4ec5-af4a-b8fed82d3d55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.255062 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.255308 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a68fa18-1c49-4d3d-bc5f-75763944d818" (UID: "7a68fa18-1c49-4d3d-bc5f-75763944d818"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.255672 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.256938 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk" (OuterVolumeSpecName: "kube-api-access-b8wvk") pod "7a68fa18-1c49-4d3d-bc5f-75763944d818" (UID: "7a68fa18-1c49-4d3d-bc5f-75763944d818"). InnerVolumeSpecName "kube-api-access-b8wvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.256972 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl" (OuterVolumeSpecName: "kube-api-access-ftbnl") pod "1029eddb-2336-4ec5-af4a-b8fed82d3d55" (UID: "1029eddb-2336-4ec5-af4a-b8fed82d3d55"). InnerVolumeSpecName "kube-api-access-ftbnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.257604 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4" (OuterVolumeSpecName: "kube-api-access-9l6p4") pod "5c9e36d9-5d53-46d8-a91a-22dc9338ab58" (UID: "5c9e36d9-5d53-46d8-a91a-22dc9338ab58"). InnerVolumeSpecName "kube-api-access-9l6p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.258349 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb" (OuterVolumeSpecName: "kube-api-access-2xbdb") pod "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" (UID: "46fd8768-dfd3-4bb1-b7c9-f4d803bf829f"). InnerVolumeSpecName "kube-api-access-2xbdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.259076 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z" (OuterVolumeSpecName: "kube-api-access-6vh8z") pod "8c25243e-b6d9-40f5-9c3b-31947cf74cc9" (UID: "8c25243e-b6d9-40f5-9c3b-31947cf74cc9"). InnerVolumeSpecName "kube-api-access-6vh8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.355773 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts\") pod \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.355862 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7spwj\" (UniqueName: \"kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj\") pod \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\" (UID: \"4c910eba-ce23-4fd9-b08a-54b96fe6a2da\") " Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356404 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c910eba-ce23-4fd9-b08a-54b96fe6a2da" (UID: "4c910eba-ce23-4fd9-b08a-54b96fe6a2da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356438 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356467 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8wvk\" (UniqueName: \"kubernetes.io/projected/7a68fa18-1c49-4d3d-bc5f-75763944d818-kube-api-access-b8wvk\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356487 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1029eddb-2336-4ec5-af4a-b8fed82d3d55-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356499 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vh8z\" (UniqueName: \"kubernetes.io/projected/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-kube-api-access-6vh8z\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356511 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6p4\" (UniqueName: \"kubernetes.io/projected/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-kube-api-access-9l6p4\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356523 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e36d9-5d53-46d8-a91a-22dc9338ab58-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356535 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c25243e-b6d9-40f5-9c3b-31947cf74cc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc 
kubenswrapper[4699]: I0226 11:31:10.356547 4699 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356559 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a68fa18-1c49-4d3d-bc5f-75763944d818-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356573 4699 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356585 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbdb\" (UniqueName: \"kubernetes.io/projected/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-kube-api-access-2xbdb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356598 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftbnl\" (UniqueName: \"kubernetes.io/projected/1029eddb-2336-4ec5-af4a-b8fed82d3d55-kube-api-access-ftbnl\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.356610 4699 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.359103 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj" (OuterVolumeSpecName: "kube-api-access-7spwj") pod "4c910eba-ce23-4fd9-b08a-54b96fe6a2da" (UID: "4c910eba-ce23-4fd9-b08a-54b96fe6a2da"). 
InnerVolumeSpecName "kube-api-access-7spwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.457802 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.457849 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7spwj\" (UniqueName: \"kubernetes.io/projected/4c910eba-ce23-4fd9-b08a-54b96fe6a2da-kube-api-access-7spwj\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.610485 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bl9wp" event={"ID":"1029eddb-2336-4ec5-af4a-b8fed82d3d55","Type":"ContainerDied","Data":"681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.610756 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="681ac56ff402c4cccc4c7060f9f4f090a2936d815a9584b9ec075bbc793e90c8" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.610653 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bl9wp" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.614573 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9z8k" event={"ID":"7f040612-306e-4ce2-b289-ed5be7bbc9e3","Type":"ContainerStarted","Data":"5b4e9b46d7abb3978f9445cbfeebb825f9cd664cf115705fdae6f65a2a171de8"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.617268 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77r5" event={"ID":"758bbe1c-d826-47f7-aff6-54e9fc4ebe63","Type":"ContainerDied","Data":"9e7dbf7b2fb001aa6400d97b6a5c91f4d444c0f906552e3eb3f37bb196932e99"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.617295 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7dbf7b2fb001aa6400d97b6a5c91f4d444c0f906552e3eb3f37bb196932e99" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.617323 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77r5" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.619054 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f3b2-account-create-update-xhgnq" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.619080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f3b2-account-create-update-xhgnq" event={"ID":"8c25243e-b6d9-40f5-9c3b-31947cf74cc9","Type":"ContainerDied","Data":"1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.619107 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e068fdc7bd3850e6a731714657c56632475a810f22a5b66f44175ff08b5063d" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.620582 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a7a2-account-create-update-l2mt4" event={"ID":"7a68fa18-1c49-4d3d-bc5f-75763944d818","Type":"ContainerDied","Data":"decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.620610 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="decc5b383a239243923da333848dd63209b18fa929f715b34f5afb37b483d9dc" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.620660 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a7a2-account-create-update-l2mt4" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.623333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nrvng-config-gbwh4" event={"ID":"46fd8768-dfd3-4bb1-b7c9-f4d803bf829f","Type":"ContainerDied","Data":"343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.623371 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="343bd3b7a46be415742ed56cacad4cbc06c91d65f0c4265fc8e92491ddf876ad" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.623577 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nrvng-config-gbwh4" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.625145 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a9e-account-create-update-fzhw8" event={"ID":"4c910eba-ce23-4fd9-b08a-54b96fe6a2da","Type":"ContainerDied","Data":"ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.625205 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5a9e-account-create-update-fzhw8" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.625180 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccd60b62a46256153e5613142403578ee18ee15936b92911827adb6661d9a59f" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.642700 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4fx8g" event={"ID":"5c9e36d9-5d53-46d8-a91a-22dc9338ab58","Type":"ContainerDied","Data":"cc8acab0f309ee31e4dd19e0655a0a1330830aac52464ec3c2b6fa9110dbc2ad"} Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.642744 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc8acab0f309ee31e4dd19e0655a0a1330830aac52464ec3c2b6fa9110dbc2ad" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.642832 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4fx8g" Feb 26 11:31:10 crc kubenswrapper[4699]: I0226 11:31:10.644569 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v9z8k" podStartSLOduration=3.038721853 podStartE2EDuration="7.644542241s" podCreationTimestamp="2026-02-26 11:31:03 +0000 UTC" firstStartedPulling="2026-02-26 11:31:05.348335976 +0000 UTC m=+1211.159162410" lastFinishedPulling="2026-02-26 11:31:09.954156364 +0000 UTC m=+1215.764982798" observedRunningTime="2026-02-26 11:31:10.636479699 +0000 UTC m=+1216.447306133" watchObservedRunningTime="2026-02-26 11:31:10.644542241 +0000 UTC m=+1216.455368675" Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.220293 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nrvng-config-gbwh4"] Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.227184 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nrvng-config-gbwh4"] Feb 26 11:31:11 crc 
kubenswrapper[4699]: I0226 11:31:11.585143 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.585203 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.585251 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.585954 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:31:11 crc kubenswrapper[4699]: I0226 11:31:11.586019 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03" gracePeriod=600 Feb 26 11:31:12 crc kubenswrapper[4699]: I0226 11:31:12.271979 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" 
path="/var/lib/kubelet/pods/46fd8768-dfd3-4bb1-b7c9-f4d803bf829f/volumes" Feb 26 11:31:12 crc kubenswrapper[4699]: I0226 11:31:12.659554 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03" exitCode=0 Feb 26 11:31:12 crc kubenswrapper[4699]: I0226 11:31:12.659612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03"} Feb 26 11:31:12 crc kubenswrapper[4699]: I0226 11:31:12.659652 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6"} Feb 26 11:31:12 crc kubenswrapper[4699]: I0226 11:31:12.659673 4699 scope.go:117] "RemoveContainer" containerID="119837a96f7eb017f5f7e56268e9cf0e4a17276f8f8dd21ae8a57f4864ea4cf7" Feb 26 11:31:13 crc kubenswrapper[4699]: I0226 11:31:13.669841 4699 generic.go:334] "Generic (PLEG): container finished" podID="72c1d656-4f85-483b-b7a2-6132b71ae093" containerID="99b2baa30a79cd9b1afa4299366118e58d2c6c18512f6454267d08d3b636f3e6" exitCode=0 Feb 26 11:31:13 crc kubenswrapper[4699]: I0226 11:31:13.669997 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nblvp" event={"ID":"72c1d656-4f85-483b-b7a2-6132b71ae093","Type":"ContainerDied","Data":"99b2baa30a79cd9b1afa4299366118e58d2c6c18512f6454267d08d3b636f3e6"} Feb 26 11:31:13 crc kubenswrapper[4699]: I0226 11:31:13.671859 4699 generic.go:334] "Generic (PLEG): container finished" podID="7f040612-306e-4ce2-b289-ed5be7bbc9e3" containerID="5b4e9b46d7abb3978f9445cbfeebb825f9cd664cf115705fdae6f65a2a171de8" exitCode=0 
Feb 26 11:31:13 crc kubenswrapper[4699]: I0226 11:31:13.671922 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9z8k" event={"ID":"7f040612-306e-4ce2-b289-ed5be7bbc9e3","Type":"ContainerDied","Data":"5b4e9b46d7abb3978f9445cbfeebb825f9cd664cf115705fdae6f65a2a171de8"} Feb 26 11:31:14 crc kubenswrapper[4699]: I0226 11:31:14.985616 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.131547 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsndg\" (UniqueName: \"kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg\") pod \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.131976 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle\") pod \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.132017 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data\") pod \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\" (UID: \"7f040612-306e-4ce2-b289-ed5be7bbc9e3\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.137620 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg" (OuterVolumeSpecName: "kube-api-access-hsndg") pod "7f040612-306e-4ce2-b289-ed5be7bbc9e3" (UID: "7f040612-306e-4ce2-b289-ed5be7bbc9e3"). InnerVolumeSpecName "kube-api-access-hsndg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.154022 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f040612-306e-4ce2-b289-ed5be7bbc9e3" (UID: "7f040612-306e-4ce2-b289-ed5be7bbc9e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.205063 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data" (OuterVolumeSpecName: "config-data") pod "7f040612-306e-4ce2-b289-ed5be7bbc9e3" (UID: "7f040612-306e-4ce2-b289-ed5be7bbc9e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.235051 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.235076 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f040612-306e-4ce2-b289-ed5be7bbc9e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.235105 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsndg\" (UniqueName: \"kubernetes.io/projected/7f040612-306e-4ce2-b289-ed5be7bbc9e3-kube-api-access-hsndg\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.261488 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nblvp" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.335676 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj5f\" (UniqueName: \"kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f\") pod \"72c1d656-4f85-483b-b7a2-6132b71ae093\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.335777 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data\") pod \"72c1d656-4f85-483b-b7a2-6132b71ae093\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.335818 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle\") pod \"72c1d656-4f85-483b-b7a2-6132b71ae093\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.336371 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data\") pod \"72c1d656-4f85-483b-b7a2-6132b71ae093\" (UID: \"72c1d656-4f85-483b-b7a2-6132b71ae093\") " Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.339596 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72c1d656-4f85-483b-b7a2-6132b71ae093" (UID: "72c1d656-4f85-483b-b7a2-6132b71ae093"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.339784 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f" (OuterVolumeSpecName: "kube-api-access-vbj5f") pod "72c1d656-4f85-483b-b7a2-6132b71ae093" (UID: "72c1d656-4f85-483b-b7a2-6132b71ae093"). InnerVolumeSpecName "kube-api-access-vbj5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.358100 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72c1d656-4f85-483b-b7a2-6132b71ae093" (UID: "72c1d656-4f85-483b-b7a2-6132b71ae093"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.379116 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data" (OuterVolumeSpecName: "config-data") pod "72c1d656-4f85-483b-b7a2-6132b71ae093" (UID: "72c1d656-4f85-483b-b7a2-6132b71ae093"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.438092 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj5f\" (UniqueName: \"kubernetes.io/projected/72c1d656-4f85-483b-b7a2-6132b71ae093-kube-api-access-vbj5f\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.438143 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.438154 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.438164 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72c1d656-4f85-483b-b7a2-6132b71ae093-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.693275 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9z8k" event={"ID":"7f040612-306e-4ce2-b289-ed5be7bbc9e3","Type":"ContainerDied","Data":"8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6"} Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.693313 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v9z8k" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.693327 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bd38e2addb3bddd3fc1ffb0bcdd8462f8cbea6db1e2b7de24e31f6134f41fd6" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.701847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nblvp" event={"ID":"72c1d656-4f85-483b-b7a2-6132b71ae093","Type":"ContainerDied","Data":"c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e"} Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.701901 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f798d0cd617868616cba798cc6a29d56d3b9e5026ef1c6b84fc3016e7bc40e" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.701930 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nblvp" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.957518 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"] Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964441 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c1d656-4f85-483b-b7a2-6132b71ae093" containerName="glance-db-sync" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964473 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c1d656-4f85-483b-b7a2-6132b71ae093" containerName="glance-db-sync" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964493 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758bbe1c-d826-47f7-aff6-54e9fc4ebe63" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964504 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="758bbe1c-d826-47f7-aff6-54e9fc4ebe63" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: 
E0226 11:31:15.964520 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c102f5c-cbaf-429e-b487-8b179f989720" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964528 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c102f5c-cbaf-429e-b487-8b179f989720" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964544 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1029eddb-2336-4ec5-af4a-b8fed82d3d55" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964554 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1029eddb-2336-4ec5-af4a-b8fed82d3d55" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964570 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c25243e-b6d9-40f5-9c3b-31947cf74cc9" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964579 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c25243e-b6d9-40f5-9c3b-31947cf74cc9" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964590 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f040612-306e-4ce2-b289-ed5be7bbc9e3" containerName="keystone-db-sync" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964597 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f040612-306e-4ce2-b289-ed5be7bbc9e3" containerName="keystone-db-sync" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964614 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="init" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964622 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="init" Feb 
26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964631 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c910eba-ce23-4fd9-b08a-54b96fe6a2da" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964639 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c910eba-ce23-4fd9-b08a-54b96fe6a2da" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964646 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" containerName="ovn-config" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964654 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" containerName="ovn-config" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964667 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9e36d9-5d53-46d8-a91a-22dc9338ab58" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964674 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9e36d9-5d53-46d8-a91a-22dc9338ab58" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964688 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a68fa18-1c49-4d3d-bc5f-75763944d818" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964698 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a68fa18-1c49-4d3d-bc5f-75763944d818" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: E0226 11:31:15.964716 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="dnsmasq-dns" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964723 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="dnsmasq-dns" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964933 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1029eddb-2336-4ec5-af4a-b8fed82d3d55" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964949 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="758bbe1c-d826-47f7-aff6-54e9fc4ebe63" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964961 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9e36d9-5d53-46d8-a91a-22dc9338ab58" containerName="mariadb-database-create" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964981 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c1d656-4f85-483b-b7a2-6132b71ae093" containerName="glance-db-sync" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.964990 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a68fa18-1c49-4d3d-bc5f-75763944d818" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.965001 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c102f5c-cbaf-429e-b487-8b179f989720" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.965009 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c25243e-b6d9-40f5-9c3b-31947cf74cc9" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.965019 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c910eba-ce23-4fd9-b08a-54b96fe6a2da" containerName="mariadb-account-create-update" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.965028 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f040612-306e-4ce2-b289-ed5be7bbc9e3" containerName="keystone-db-sync" Feb 26 11:31:15 
crc kubenswrapper[4699]: I0226 11:31:15.965037 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a166832-199a-436c-85a2-4ccde527f180" containerName="dnsmasq-dns" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.965050 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fd8768-dfd3-4bb1-b7c9-f4d803bf829f" containerName="ovn-config" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.966215 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" Feb 26 11:31:15 crc kubenswrapper[4699]: I0226 11:31:15.996205 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.009460 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rx6w7"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.010705 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rx6w7" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.039250 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.039380 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qbntt" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.039451 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.039615 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.039688 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.098693 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-rx6w7"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152371 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w8wd\" (UniqueName: \"kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152455 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152495 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152526 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152560 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152600 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152701 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152758 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152787 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152860 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgf56\" (UniqueName: \"kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152895 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.152932 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.227052 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-db87b77d9-ns48f"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.228885 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.234623 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.234914 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.235165 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.236825 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-84wm7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257766 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w8wd\" (UniqueName: \"kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257813 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257838 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257867 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257886 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257944 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257972 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257989 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.258030 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgf56\" (UniqueName: \"kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.258050 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.258069 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.265964 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.266638 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.267194 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.267680 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.268266 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.257765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-db87b77d9-ns48f"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.274572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.275715 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.281527 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.285847 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.286391 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.292643 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f49xd"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.300145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w8wd\" (UniqueName: \"kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd\") pod \"keystone-bootstrap-rx6w7\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") " pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.301877 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.312174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgf56\" (UniqueName: \"kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56\") pod \"dnsmasq-dns-6f8c45789f-4bmfl\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.326020 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.326239 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.326262 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bgvh2"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.339419 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f49xd"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.359006 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4xk\" (UniqueName: \"kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.359054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.359083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.359156 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.359218 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.367388 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.369320 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.376795 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.377853 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.384294 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.384529 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.386724 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.411633 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.436437 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dr78q"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.437598 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.445391 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.445632 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.445782 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xrfkn"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460450 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460486 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460516 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460550 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460576 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460604 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460634 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460659 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460696 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460714 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460740 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4xk\" (UniqueName: \"kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460761 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460786 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460816 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srl4m\" (UniqueName: \"kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460850 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460877 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460907 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.460926 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9sd\" (UniqueName: \"kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.464990 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.465141 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.466096 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.476104 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.503863 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-xbnb6"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.505322 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.507755 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4xk\" (UniqueName: \"kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk\") pod \"horizon-db87b77d9-ns48f\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") " pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.533184 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7g59c"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.534274 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7g59c"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.553540 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zs6cf"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.553657 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.553725 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.557452 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f6f7dcd75-m9jm6"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.562897 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srl4m\" (UniqueName: \"kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.562954 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.562974 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.562995 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563018 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9sd\" (UniqueName: \"kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563035 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563060 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563081 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563149 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563172 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563203 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqk94\" (UniqueName: \"kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563278 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.563296 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.569321 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.569802 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.570063 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.570512 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dr78q"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.576189 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.578343 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7g59c"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.587969 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.588175 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.588516 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.588909 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.590018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.590563 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.594540 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.600502 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.607586 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.607616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srl4m\" (UniqueName: \"kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m\") pod \"ceilometer-0\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " pod="openstack/ceilometer-0" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.620474 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9sd\" (UniqueName: \"kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd\") pod \"cinder-db-sync-f49xd\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") " pod="openstack/cinder-db-sync-f49xd" Feb 26 
11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.621853 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-xbnb6"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.638580 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-z6w9z"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.639848 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.649650 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2ghn5" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.649839 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.652708 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673382 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673433 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673462 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673485 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673515 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqk94\" (UniqueName: \"kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673533 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673557 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n82h\" (UniqueName: \"kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673578 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673598 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673624 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvm74\" (UniqueName: \"kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673647 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673664 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673713 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673748 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673773 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.673796 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmccd\" (UniqueName: \"kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.680021 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.687847 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.706200 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z6w9z"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.707508 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f49xd" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.714751 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqk94\" (UniqueName: \"kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94\") pod \"neutron-db-sync-dr78q\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") " pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.715540 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.733930 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-xbnb6"] Feb 26 11:31:16 crc kubenswrapper[4699]: E0226 11:31:16.734782 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-tvm74 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" podUID="e1315502-3c1c-4d70-b105-d31a6e2fe754" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.755378 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778472 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778514 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778579 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778622 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778658 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778692 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n82h\" (UniqueName: \"kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778717 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778739 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.778769 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvm74\" (UniqueName: \"kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780185 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780318 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-99w54\" (UniqueName: \"kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780510 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780561 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780613 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.780667 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmccd\" (UniqueName: 
\"kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.781776 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.784545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.787350 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.789586 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.789854 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: 
\"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.789966 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.790145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.790976 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.792642 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.798488 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.800404 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dr78q" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.803391 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.803670 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j4q6c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.806979 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.810747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.812387 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.819786 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmccd\" (UniqueName: \"kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.820286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n82h\" (UniqueName: \"kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h\") pod \"horizon-7f6f7dcd75-m9jm6\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") " pod="openstack/horizon-7f6f7dcd75-m9jm6" Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.840412 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle\") pod \"barbican-db-sync-7g59c\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " pod="openstack/barbican-db-sync-7g59c"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.861109 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvm74\" (UniqueName: \"kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74\") pod \"dnsmasq-dns-6c9c9f998c-xbnb6\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.876335 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.877774 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882237 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqww\" (UniqueName: \"kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882457 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882609 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882705 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.882958 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.883304 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.885598 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.888792 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"]
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.889980 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99w54\" (UniqueName: \"kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.891591 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.892593 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.891616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.890187 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.890582 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7g59c"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.894319 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.899943 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.898709 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.916689 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f6f7dcd75-m9jm6"
Feb 26 11:31:16 crc kubenswrapper[4699]: I0226 11:31:16.929141 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99w54\" (UniqueName: \"kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54\") pod \"placement-db-sync-z6w9z\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.001770 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z6w9z"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.002219 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.003343 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.003492 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.003608 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.003915 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqww\" (UniqueName: \"kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.004247 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.003994 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005057 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005363 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005517 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005711 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.005828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.006357 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.007256 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.007006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.010875 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.016732 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.021228 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.034345 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqww\" (UniqueName: \"kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.073706 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110427 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110496 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110513 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110544 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110584 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.110599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.111782 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.112068 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.112185 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.112720 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.114765 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.154516 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.169679 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6\") pod \"dnsmasq-dns-57c957c4ff-nhzhh\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: W0226 11:31:17.240184 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fc97c22_3dce_4a90_bd78_d976a368e56c.slice/crio-9f0340f743584c76493adaa36d5d0b55236ad93a692317a2e0b7aec568cbee52 WatchSource:0}: Error finding container 9f0340f743584c76493adaa36d5d0b55236ad93a692317a2e0b7aec568cbee52: Status 404 returned error can't find the container with id 9f0340f743584c76493adaa36d5d0b55236ad93a692317a2e0b7aec568cbee52
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.260513 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rx6w7"]
Feb 26 11:31:17 crc kubenswrapper[4699]: W0226 11:31:17.278947 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833927c0_710f_446e_a3be_0df2b2399638.slice/crio-654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5 WatchSource:0}: Error finding container 654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5: Status 404 returned error can't find the container with id 654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.279005 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.407240 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.410456 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.415590 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.421751 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.424173 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.520763 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525418 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525484 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525541 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525564 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525593 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525619 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.525696 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwxg\" (UniqueName: \"kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.539733 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-db87b77d9-ns48f"]
Feb 26 11:31:17 crc kubenswrapper[4699]: W0226 11:31:17.550261 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9cf42d8_ed15_44dd_aaed_fbffa29417c4.slice/crio-1e699d538cdebfe589c475a344848f907766835460184b9e330ed614ebb6483c WatchSource:0}: Error finding container 1e699d538cdebfe589c475a344848f907766835460184b9e330ed614ebb6483c: Status 404 returned error can't find the container with id 1e699d538cdebfe589c475a344848f907766835460184b9e330ed614ebb6483c
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631182 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631235 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631270 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631367 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwxg\" (UniqueName: \"kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631406 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631436 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.631866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.632110 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.638381 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.642492 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.643604 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.660189 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.687484 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwxg\" (UniqueName: \"kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.734353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.760207 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dr78q"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.778652 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f49xd"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.785195 4699 generic.go:334] "Generic (PLEG): container finished" podID="7fc97c22-3dce-4a90-bd78-d976a368e56c" containerID="e826d5a3469d5b15dd136a91917fe06077ab83ca45f565c094384a045ed23b99" exitCode=0
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.785251 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" event={"ID":"7fc97c22-3dce-4a90-bd78-d976a368e56c","Type":"ContainerDied","Data":"e826d5a3469d5b15dd136a91917fe06077ab83ca45f565c094384a045ed23b99"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.785276 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" event={"ID":"7fc97c22-3dce-4a90-bd78-d976a368e56c","Type":"ContainerStarted","Data":"9f0340f743584c76493adaa36d5d0b55236ad93a692317a2e0b7aec568cbee52"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.787980 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7g59c"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.826344 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rx6w7" event={"ID":"833927c0-710f-446e-a3be-0df2b2399638","Type":"ContainerStarted","Data":"5822866374c533954891aab83b4e82e6518ecfafe343985ba49ddc3abdfd00dc"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.826398 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rx6w7" event={"ID":"833927c0-710f-446e-a3be-0df2b2399638","Type":"ContainerStarted","Data":"654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.853666 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db87b77d9-ns48f" event={"ID":"c9cf42d8-ed15-44dd-aaed-fbffa29417c4","Type":"ContainerStarted","Data":"1e699d538cdebfe589c475a344848f907766835460184b9e330ed614ebb6483c"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.874333 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.874983 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerStarted","Data":"50c24ca371e65d6a43a9a97ed072f4bd1eadffc6515aa3e571658b4eeec32c3b"}
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.882091 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rx6w7" podStartSLOduration=2.88206837 podStartE2EDuration="2.88206837s" podCreationTimestamp="2026-02-26 11:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:17.875263944 +0000 UTC m=+1223.686090388" watchObservedRunningTime="2026-02-26 11:31:17.88206837 +0000 UTC m=+1223.692894804"
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.918782 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6"
Feb 26 11:31:17 crc kubenswrapper[4699]: W0226 11:31:17.970607 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47a9d008_5b7e_4866_b92b_efcb60cbfdb0.slice/crio-920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616 WatchSource:0}: Error finding container 920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616: Status 404 returned error can't find the container with id 920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.979719 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"]
Feb 26 11:31:17 crc kubenswrapper[4699]: I0226 11:31:17.987539 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z6w9z"]
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.034849 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052258 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") "
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052353 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") "
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052397 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") "
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052497 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") "
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052602 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") "
Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.052702 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvm74\" 
(UniqueName: \"kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74\") pod \"e1315502-3c1c-4d70-b105-d31a6e2fe754\" (UID: \"e1315502-3c1c-4d70-b105-d31a6e2fe754\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.053801 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.053923 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.054134 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.054358 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config" (OuterVolumeSpecName: "config") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.054519 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.058705 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74" (OuterVolumeSpecName: "kube-api-access-tvm74") pod "e1315502-3c1c-4d70-b105-d31a6e2fe754" (UID: "e1315502-3c1c-4d70-b105-d31a6e2fe754"). InnerVolumeSpecName "kube-api-access-tvm74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.059797 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.161789 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.161842 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvm74\" (UniqueName: \"kubernetes.io/projected/e1315502-3c1c-4d70-b105-d31a6e2fe754-kube-api-access-tvm74\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.161855 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 
11:31:18.161865 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.161876 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.161912 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1315502-3c1c-4d70-b105-d31a6e2fe754-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.314388 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.317777 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.372741 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.372789 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.372909 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.373003 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgf56\" (UniqueName: \"kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.373020 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.373082 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb\") pod \"7fc97c22-3dce-4a90-bd78-d976a368e56c\" (UID: \"7fc97c22-3dce-4a90-bd78-d976a368e56c\") " Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.393741 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56" (OuterVolumeSpecName: "kube-api-access-dgf56") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "kube-api-access-dgf56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.426396 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.431405 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config" (OuterVolumeSpecName: "config") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.476047 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgf56\" (UniqueName: \"kubernetes.io/projected/7fc97c22-3dce-4a90-bd78-d976a368e56c-kube-api-access-dgf56\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.476316 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.476333 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.492143 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.516539 4699 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.579512 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"] Feb 26 11:31:18 crc kubenswrapper[4699]: E0226 11:31:18.581928 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc97c22-3dce-4a90-bd78-d976a368e56c" containerName="init" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.581961 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc97c22-3dce-4a90-bd78-d976a368e56c" containerName="init" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.582210 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc97c22-3dce-4a90-bd78-d976a368e56c" containerName="init" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.603204 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.607426 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.615254 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.615724 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fc97c22-3dce-4a90-bd78-d976a368e56c" (UID: "7fc97c22-3dce-4a90-bd78-d976a368e56c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.643943 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.682266 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.683639 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.683706 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.683774 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 
crc kubenswrapper[4699]: I0226 11:31:18.684828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkrb\" (UniqueName: \"kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.684853 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.684981 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.685002 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.685011 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc97c22-3dce-4a90-bd78-d976a368e56c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.714668 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.778435 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:31:18 crc kubenswrapper[4699]: W0226 11:31:18.780357 4699 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c24335e_75be_481e_b1c8_631913d074ee.slice/crio-3505d7f98aca89b6416ea19c75cdca4f118df015fab7cba4bb6ff9fd01c39fa6 WatchSource:0}: Error finding container 3505d7f98aca89b6416ea19c75cdca4f118df015fab7cba4bb6ff9fd01c39fa6: Status 404 returned error can't find the container with id 3505d7f98aca89b6416ea19c75cdca4f118df015fab7cba4bb6ff9fd01c39fa6 Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.788141 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.788191 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkrb\" (UniqueName: \"kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.788215 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.788809 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: 
I0226 11:31:18.788879 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.789224 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.789419 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.791512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.805313 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.819252 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkrb\" (UniqueName: 
\"kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb\") pod \"horizon-6d649b895f-2cm8f\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") " pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.908414 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z6w9z" event={"ID":"47a9d008-5b7e-4866-b92b-efcb60cbfdb0","Type":"ContainerStarted","Data":"920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.910908 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerStarted","Data":"3505d7f98aca89b6416ea19c75cdca4f118df015fab7cba4bb6ff9fd01c39fa6"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.913322 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" event={"ID":"7fc97c22-3dce-4a90-bd78-d976a368e56c","Type":"ContainerDied","Data":"9f0340f743584c76493adaa36d5d0b55236ad93a692317a2e0b7aec568cbee52"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.913379 4699 scope.go:117] "RemoveContainer" containerID="e826d5a3469d5b15dd136a91917fe06077ab83ca45f565c094384a045ed23b99" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.913507 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-4bmfl" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.927171 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6f7dcd75-m9jm6" event={"ID":"41ed545b-f613-4408-bd1c-df5a09432e39","Type":"ContainerStarted","Data":"b1308b571f0b2b92fb651e80c640ed1db7c81e3d85041ae47619d6dae7c87aad"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.931428 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7g59c" event={"ID":"d45d20cb-c561-4b84-b327-9b096865e8bb","Type":"ContainerStarted","Data":"a8bf2edfe1a0cab1df993c5f3eabf3a6892b72d4d33db983d7476af16ba0c19b"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.934709 4699 generic.go:334] "Generic (PLEG): container finished" podID="81843e2c-774f-402a-bd90-c4485ab24c05" containerID="2161a9d96d5b3712e81eaf624a88f2f6f3ee6fc2f0aaa102d1a1b03d768333c4" exitCode=0 Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.934786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" event={"ID":"81843e2c-774f-402a-bd90-c4485ab24c05","Type":"ContainerDied","Data":"2161a9d96d5b3712e81eaf624a88f2f6f3ee6fc2f0aaa102d1a1b03d768333c4"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.934808 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" event={"ID":"81843e2c-774f-402a-bd90-c4485ab24c05","Type":"ContainerStarted","Data":"fed26d1422b55affaace34ac700e5a58aa1d192cab8a88f61c67c7cb3b1ca3ed"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.938759 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f49xd" event={"ID":"8426fd89-9eba-46fa-8611-e98cc7636b41","Type":"ContainerStarted","Data":"3e0a4f4a5840bf076a02406c3b220ed5f7a7941a35ea7875a55be88dc0efa11e"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.942619 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d649b895f-2cm8f" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.943556 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dr78q" event={"ID":"ae813248-510e-4b19-bcd8-39cefca6cd37","Type":"ContainerStarted","Data":"0eab0de6a835999edb566f7a018ef04e992296918bfb17f761cbea8ef8c3775a"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.943598 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dr78q" event={"ID":"ae813248-510e-4b19-bcd8-39cefca6cd37","Type":"ContainerStarted","Data":"41ed63c8f69999d16d3b8a0632b0099f90cd743a1b305a70c63928dff741248e"} Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.945874 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-xbnb6" Feb 26 11:31:18 crc kubenswrapper[4699]: I0226 11:31:18.946931 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerStarted","Data":"0462d99185a120468341d7f6efeca5ca1d1c779c506ddb0fb105a2de0f655ad5"} Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.009324 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dr78q" podStartSLOduration=3.009308383 podStartE2EDuration="3.009308383s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:18.998276135 +0000 UTC m=+1224.809102569" watchObservedRunningTime="2026-02-26 11:31:19.009308383 +0000 UTC m=+1224.820134817" Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.207306 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"] Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.218334 4699 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-4bmfl"] Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.266706 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-xbnb6"] Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.289805 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-xbnb6"] Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.469891 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"] Feb 26 11:31:19 crc kubenswrapper[4699]: W0226 11:31:19.495449 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6628395_d6a6_4719_b0ad_10984c3c172b.slice/crio-cf033b29d548f1e02ed1b1bad110c9d77ffdf16f842c34bb4ebc18230fbed6bf WatchSource:0}: Error finding container cf033b29d548f1e02ed1b1bad110c9d77ffdf16f842c34bb4ebc18230fbed6bf: Status 404 returned error can't find the container with id cf033b29d548f1e02ed1b1bad110c9d77ffdf16f842c34bb4ebc18230fbed6bf Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.958334 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerStarted","Data":"790e1bee9a89611157009e82024f95d4afa15834cb397b52f1c9c892d7cb8150"} Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.960766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" event={"ID":"81843e2c-774f-402a-bd90-c4485ab24c05","Type":"ContainerStarted","Data":"3eda8514ede18fd03dc0849cf95cf8d9b4cb3f130429078ff465a976e2f5421b"} Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.961088 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.965964 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d649b895f-2cm8f" event={"ID":"d6628395-d6a6-4719-b0ad-10984c3c172b","Type":"ContainerStarted","Data":"cf033b29d548f1e02ed1b1bad110c9d77ffdf16f842c34bb4ebc18230fbed6bf"} Feb 26 11:31:19 crc kubenswrapper[4699]: I0226 11:31:19.984578 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" podStartSLOduration=3.984522285 podStartE2EDuration="3.984522285s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:19.98120908 +0000 UTC m=+1225.792035524" watchObservedRunningTime="2026-02-26 11:31:19.984522285 +0000 UTC m=+1225.795348719" Feb 26 11:31:20 crc kubenswrapper[4699]: I0226 11:31:20.284180 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc97c22-3dce-4a90-bd78-d976a368e56c" path="/var/lib/kubelet/pods/7fc97c22-3dce-4a90-bd78-d976a368e56c/volumes" Feb 26 11:31:20 crc kubenswrapper[4699]: I0226 11:31:20.284686 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1315502-3c1c-4d70-b105-d31a6e2fe754" path="/var/lib/kubelet/pods/e1315502-3c1c-4d70-b105-d31a6e2fe754/volumes" Feb 26 11:31:20 crc kubenswrapper[4699]: I0226 11:31:20.991533 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerStarted","Data":"fcc40c7508a6a00f53ef699bf82940d37acb3bc8e8309bb9b5ea1335e70a77f3"} Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.023755 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerStarted","Data":"1784112af5cd06d4ca4320e949f04a24c30b53553bdf95678319674202498461"} Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 
11:31:22.023916 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-log" containerID="cri-o://790e1bee9a89611157009e82024f95d4afa15834cb397b52f1c9c892d7cb8150" gracePeriod=30
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.024208 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-httpd" containerID="cri-o://1784112af5cd06d4ca4320e949f04a24c30b53553bdf95678319674202498461" gracePeriod=30
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.035221 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerStarted","Data":"bd25a0bbaec5054f7454612c57bed5e09bb1f26fa1edc54363bf9ae7af5130e5"}
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.035424 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-log" containerID="cri-o://fcc40c7508a6a00f53ef699bf82940d37acb3bc8e8309bb9b5ea1335e70a77f3" gracePeriod=30
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.035471 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-httpd" containerID="cri-o://bd25a0bbaec5054f7454612c57bed5e09bb1f26fa1edc54363bf9ae7af5130e5" gracePeriod=30
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.079266 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.079246538 podStartE2EDuration="6.079246538s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:22.077537319 +0000 UTC m=+1227.888363753" watchObservedRunningTime="2026-02-26 11:31:22.079246538 +0000 UTC m=+1227.890072962"
Feb 26 11:31:22 crc kubenswrapper[4699]: I0226 11:31:22.079574 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.079566508 podStartE2EDuration="6.079566508s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:22.0512526 +0000 UTC m=+1227.862079044" watchObservedRunningTime="2026-02-26 11:31:22.079566508 +0000 UTC m=+1227.890392942"
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.047378 4699 generic.go:334] "Generic (PLEG): container finished" podID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerID="1784112af5cd06d4ca4320e949f04a24c30b53553bdf95678319674202498461" exitCode=143
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.047721 4699 generic.go:334] "Generic (PLEG): container finished" podID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerID="790e1bee9a89611157009e82024f95d4afa15834cb397b52f1c9c892d7cb8150" exitCode=143
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.047506 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerDied","Data":"1784112af5cd06d4ca4320e949f04a24c30b53553bdf95678319674202498461"}
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.047805 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerDied","Data":"790e1bee9a89611157009e82024f95d4afa15834cb397b52f1c9c892d7cb8150"}
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.052242 4699 generic.go:334] "Generic (PLEG): container finished" podID="9c24335e-75be-481e-b1c8-631913d074ee" containerID="bd25a0bbaec5054f7454612c57bed5e09bb1f26fa1edc54363bf9ae7af5130e5" exitCode=143
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.052279 4699 generic.go:334] "Generic (PLEG): container finished" podID="9c24335e-75be-481e-b1c8-631913d074ee" containerID="fcc40c7508a6a00f53ef699bf82940d37acb3bc8e8309bb9b5ea1335e70a77f3" exitCode=143
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.052323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerDied","Data":"bd25a0bbaec5054f7454612c57bed5e09bb1f26fa1edc54363bf9ae7af5130e5"}
Feb 26 11:31:23 crc kubenswrapper[4699]: I0226 11:31:23.052353 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerDied","Data":"fcc40c7508a6a00f53ef699bf82940d37acb3bc8e8309bb9b5ea1335e70a77f3"}
Feb 26 11:31:24 crc kubenswrapper[4699]: I0226 11:31:24.072382 4699 generic.go:334] "Generic (PLEG): container finished" podID="833927c0-710f-446e-a3be-0df2b2399638" containerID="5822866374c533954891aab83b4e82e6518ecfafe343985ba49ddc3abdfd00dc" exitCode=0
Feb 26 11:31:24 crc kubenswrapper[4699]: I0226 11:31:24.072430 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rx6w7" event={"ID":"833927c0-710f-446e-a3be-0df2b2399638","Type":"ContainerDied","Data":"5822866374c533954891aab83b4e82e6518ecfafe343985ba49ddc3abdfd00dc"}
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.667076 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.678592 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.689602 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746617 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746666 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746684 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746724 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746793 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746812 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746833 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746848 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746866 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746882 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwxg\" (UniqueName: \"kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746900 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746921 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746936 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.746953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w8wd\" (UniqueName: \"kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747032 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data\") pod \"9c24335e-75be-481e-b1c8-631913d074ee\" (UID: \"9c24335e-75be-481e-b1c8-631913d074ee\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747076 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqww\" (UniqueName: \"kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747104 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts\") pod \"833927c0-710f-446e-a3be-0df2b2399638\" (UID: \"833927c0-710f-446e-a3be-0df2b2399638\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747139 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747162 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.747195 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle\") pod \"630dc7fb-8bb5-4136-accd-eb460ad0e940\" (UID: \"630dc7fb-8bb5-4136-accd-eb460ad0e940\") "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.749924 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs" (OuterVolumeSpecName: "logs") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.751348 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs" (OuterVolumeSpecName: "logs") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.754561 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg" (OuterVolumeSpecName: "kube-api-access-6xwxg") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "kube-api-access-6xwxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.756279 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts" (OuterVolumeSpecName: "scripts") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.757020 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts" (OuterVolumeSpecName: "scripts") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.758908 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.759437 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.759825 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.760317 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.760637 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.760671 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.761706 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts" (OuterVolumeSpecName: "scripts") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.761986 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww" (OuterVolumeSpecName: "kube-api-access-ztqww") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "kube-api-access-ztqww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.772778 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd" (OuterVolumeSpecName: "kube-api-access-9w8wd") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "kube-api-access-9w8wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.800150 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.809247 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.814922 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data" (OuterVolumeSpecName: "config-data") pod "833927c0-710f-446e-a3be-0df2b2399638" (UID: "833927c0-710f-446e-a3be-0df2b2399638"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.822459 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.823490 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data" (OuterVolumeSpecName: "config-data") pod "9c24335e-75be-481e-b1c8-631913d074ee" (UID: "9c24335e-75be-481e-b1c8-631913d074ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.841104 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data" (OuterVolumeSpecName: "config-data") pod "630dc7fb-8bb5-4136-accd-eb460ad0e940" (UID: "630dc7fb-8bb5-4136-accd-eb460ad0e940"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852244 4699 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852480 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852541 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852633 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852694 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852756 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwxg\" (UniqueName: \"kubernetes.io/projected/9c24335e-75be-481e-b1c8-631913d074ee-kube-api-access-6xwxg\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852814 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852870 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852937 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.852996 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w8wd\" (UniqueName: \"kubernetes.io/projected/833927c0-710f-446e-a3be-0df2b2399638-kube-api-access-9w8wd\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853054 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853128 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqww\" (UniqueName: \"kubernetes.io/projected/630dc7fb-8bb5-4136-accd-eb460ad0e940-kube-api-access-ztqww\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853211 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853282 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853339 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853407 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630dc7fb-8bb5-4136-accd-eb460ad0e940-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853463 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630dc7fb-8bb5-4136-accd-eb460ad0e940-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853531 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c24335e-75be-481e-b1c8-631913d074ee-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853592 4699 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/833927c0-710f-446e-a3be-0df2b2399638-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.853648 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c24335e-75be-481e-b1c8-631913d074ee-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.873586 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.879380 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.955431 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:25 crc kubenswrapper[4699]: I0226 11:31:25.955470 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.093339 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.093363 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9c24335e-75be-481e-b1c8-631913d074ee","Type":"ContainerDied","Data":"3505d7f98aca89b6416ea19c75cdca4f118df015fab7cba4bb6ff9fd01c39fa6"}
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.093417 4699 scope.go:117] "RemoveContainer" containerID="bd25a0bbaec5054f7454612c57bed5e09bb1f26fa1edc54363bf9ae7af5130e5"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.096643 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rx6w7" event={"ID":"833927c0-710f-446e-a3be-0df2b2399638","Type":"ContainerDied","Data":"654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5"}
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.096679 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654ff57a8dfa79f3d0a638f1486da08c6ebdf1d037f74a27a5ccdbee84215eb5"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.096733 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rx6w7"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.101846 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.101780 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"630dc7fb-8bb5-4136-accd-eb460ad0e940","Type":"ContainerDied","Data":"0462d99185a120468341d7f6efeca5ca1d1c779c506ddb0fb105a2de0f655ad5"}
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.160676 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.170255 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.183493 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.198474 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.204647 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: E0226 11:31:26.205052 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205074 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: E0226 11:31:26.205094 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833927c0-710f-446e-a3be-0df2b2399638" containerName="keystone-bootstrap"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205101 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="833927c0-710f-446e-a3be-0df2b2399638" containerName="keystone-bootstrap"
Feb 26 11:31:26 crc kubenswrapper[4699]: E0226 11:31:26.205124 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205135 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: E0226 11:31:26.205148 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205154 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: E0226 11:31:26.205175 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205182 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205368 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205384 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="833927c0-710f-446e-a3be-0df2b2399638" containerName="keystone-bootstrap"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205399 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205411 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-log"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.205419 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c24335e-75be-481e-b1c8-631913d074ee" containerName="glance-httpd"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.206344 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.210315 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j4q6c"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.211697 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.211932 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.214480 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rx6w7"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.228415 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.233095 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.235006 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.239748 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260182 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260514 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260543 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260572 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260655 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698q6\" (UniqueName: \"kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260728 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.260822 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.300199 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630dc7fb-8bb5-4136-accd-eb460ad0e940" path="/var/lib/kubelet/pods/630dc7fb-8bb5-4136-accd-eb460ad0e940/volumes"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.301084 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c24335e-75be-481e-b1c8-631913d074ee" path="/var/lib/kubelet/pods/9c24335e-75be-481e-b1c8-631913d074ee/volumes"
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.301783 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rx6w7"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.301814 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.308566
4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-28v5g"] Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.309658 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.312695 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.312803 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.312865 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.313046 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qbntt" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.314810 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.323988 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-28v5g"] Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362343 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362388 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " 
pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362410 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362435 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362696 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb644\" (UniqueName: \"kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362800 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.362963 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89dst\" (UniqueName: \"kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " 
pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363005 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363151 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363181 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363249 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363310 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" 
Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363448 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363480 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363551 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 
11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363632 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363686 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.363826 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698q6\" (UniqueName: \"kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.365897 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.366355 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.366806 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.369475 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.370263 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.375285 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.390660 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698q6\" (UniqueName: \"kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.406417 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467235 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467285 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467311 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467344 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467371 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb644\" (UniqueName: \"kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644\") pod \"glance-default-external-api-0\" (UID: 
\"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467400 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89dst\" (UniqueName: \"kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467424 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467466 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467490 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467533 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " 
pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467568 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.467661 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.468812 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.471543 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.472344 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.472701 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.472920 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.473822 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.475501 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.476084 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.489848 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.489882 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.489996 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.492556 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89dst\" (UniqueName: \"kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst\") pod \"keystone-bootstrap-28v5g\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.493560 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb644\" (UniqueName: \"kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644\") pod 
\"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.502008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.543586 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.589166 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:31:26 crc kubenswrapper[4699]: I0226 11:31:26.634956 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.424346 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.497869 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.498164 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns" containerID="cri-o://fe976bbefde2fa99a8167c39df0e86003afc4a567d5a020335a060d2c650e894" gracePeriod=10 Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.693535 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-db87b77d9-ns48f"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.742012 4699 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.749421 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57899c756d-w9pc5"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.750943 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.755605 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.763955 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57899c756d-w9pc5"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801183 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801244 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801299 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252gw\" (UniqueName: \"kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801406 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801499 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801522 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.801556 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.833921 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.866171 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.878755 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-5795557cd8-dvzqq"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.880797 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.898257 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5795557cd8-dvzqq"] Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.902942 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-combined-ca-bundle\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.902988 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903017 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903044 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903077 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-252gw\" (UniqueName: \"kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903153 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-tls-certs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903172 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-secret-key\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903192 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-scripts\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903230 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903252 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-logs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903294 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-config-data\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903321 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkww\" (UniqueName: \"kubernetes.io/projected/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-kube-api-access-thkww\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903343 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.903359 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.904551 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.909033 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.909524 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.911362 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.914530 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.918626 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key\") pod \"horizon-57899c756d-w9pc5\" (UID: 
\"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:27 crc kubenswrapper[4699]: I0226 11:31:27.943573 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-252gw\" (UniqueName: \"kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw\") pod \"horizon-57899c756d-w9pc5\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005210 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-tls-certs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005251 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-secret-key\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005272 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-scripts\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005296 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-logs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc 
kubenswrapper[4699]: I0226 11:31:28.005339 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-config-data\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005372 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkww\" (UniqueName: \"kubernetes.io/projected/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-kube-api-access-thkww\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.005399 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-combined-ca-bundle\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.006640 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-logs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.006700 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-scripts\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.007950 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-config-data\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.009874 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-combined-ca-bundle\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.010689 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-tls-certs\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.026605 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkww\" (UniqueName: \"kubernetes.io/projected/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-kube-api-access-thkww\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.031840 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0-horizon-secret-key\") pod \"horizon-5795557cd8-dvzqq\" (UID: \"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0\") " pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.122525 4699 generic.go:334] "Generic (PLEG): container finished" podID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerID="fe976bbefde2fa99a8167c39df0e86003afc4a567d5a020335a060d2c650e894" exitCode=0 Feb 26 
11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.122570 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" event={"ID":"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9","Type":"ContainerDied","Data":"fe976bbefde2fa99a8167c39df0e86003afc4a567d5a020335a060d2c650e894"} Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.130405 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.211719 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.270603 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833927c0-710f-446e-a3be-0df2b2399638" path="/var/lib/kubelet/pods/833927c0-710f-446e-a3be-0df2b2399638/volumes" Feb 26 11:31:28 crc kubenswrapper[4699]: I0226 11:31:28.977015 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 26 11:31:33 crc kubenswrapper[4699]: I0226 11:31:33.976745 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 26 11:31:35 crc kubenswrapper[4699]: E0226 11:31:35.764125 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 11:31:35 crc kubenswrapper[4699]: E0226 11:31:35.764559 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h569h585h556h5c5h677h95h567h95h59dh59h56dh654h646hdchd4hd5h5cdh88h666hfbh665h5bdh5fbh5ffhd5h5c5h76hf6h57chddh68cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8k4xk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-db87b77d9-ns48f_openstack(c9cf42d8-ed15-44dd-aaed-fbffa29417c4): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:31:35 crc kubenswrapper[4699]: E0226 11:31:35.767091 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-db87b77d9-ns48f" podUID="c9cf42d8-ed15-44dd-aaed-fbffa29417c4" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.141248 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.141698 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99w54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-z6w9z_openstack(47a9d008-5b7e-4866-b92b-efcb60cbfdb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.143081 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-z6w9z" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.157711 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.157882 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh588h58hf9h674h68bh66ch98h668h55hf5h59dh8fh5ffhb5h6dh596h5c4h8ch8fh657hc7h68fh59bh58h64fhf7h66dh657h86h9fh668q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7n82h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f6f7dcd75-m9jm6_openstack(41ed545b-f613-4408-bd1c-df5a09432e39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 
11:31:37.160378 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7f6f7dcd75-m9jm6" podUID="41ed545b-f613-4408-bd1c-df5a09432e39" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.180939 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.181089 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55bh57dh5b4h99hb7h587h5cch687h678hcdh699h58ch666h5bhdch559h65dh66fh99h698h5f7h656h54dh58ch65h666h679h5bfhc7h6fh5ddh679q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6d649b895f-2cm8f_openstack(d6628395-d6a6-4719-b0ad-10984c3c172b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 
11:31:37.183619 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6d649b895f-2cm8f" podUID="d6628395-d6a6-4719-b0ad-10984c3c172b" Feb 26 11:31:37 crc kubenswrapper[4699]: E0226 11:31:37.193344 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-z6w9z" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" Feb 26 11:31:40 crc kubenswrapper[4699]: I0226 11:31:40.213355 4699 generic.go:334] "Generic (PLEG): container finished" podID="ae813248-510e-4b19-bcd8-39cefca6cd37" containerID="0eab0de6a835999edb566f7a018ef04e992296918bfb17f761cbea8ef8c3775a" exitCode=0 Feb 26 11:31:40 crc kubenswrapper[4699]: I0226 11:31:40.213502 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dr78q" event={"ID":"ae813248-510e-4b19-bcd8-39cefca6cd37","Type":"ContainerDied","Data":"0eab0de6a835999edb566f7a018ef04e992296918bfb17f761cbea8ef8c3775a"} Feb 26 11:31:43 crc kubenswrapper[4699]: I0226 11:31:43.978197 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 26 11:31:43 crc kubenswrapper[4699]: I0226 11:31:43.978957 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" Feb 26 11:31:45 crc kubenswrapper[4699]: E0226 11:31:45.632644 4699 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 26 11:31:45 crc kubenswrapper[4699]: E0226 11:31:45.633132 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7h67bh688h56bhd8hd5h5h58fh95h5b5hb4h5b6h564h658h6fh58fh5ch5dfh556hfch657h678h694h67fh566h98h5b7h64h576hc4h5bch695q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srl4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7cec2d73-9ca8-4a8b-836d-efce961fbde8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.692620 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.724191 4699 scope.go:117] "RemoveContainer" containerID="fcc40c7508a6a00f53ef699bf82940d37acb3bc8e8309bb9b5ea1335e70a77f3"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.742484 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data\") pod \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.742635 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts\") pod \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.742671 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs\") pod \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.742774 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4xk\" (UniqueName: \"kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk\") pod \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.742863 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key\") pod \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\" (UID: \"c9cf42d8-ed15-44dd-aaed-fbffa29417c4\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.743375 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs" (OuterVolumeSpecName: "logs") pod "c9cf42d8-ed15-44dd-aaed-fbffa29417c4" (UID: "c9cf42d8-ed15-44dd-aaed-fbffa29417c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.743631 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data" (OuterVolumeSpecName: "config-data") pod "c9cf42d8-ed15-44dd-aaed-fbffa29417c4" (UID: "c9cf42d8-ed15-44dd-aaed-fbffa29417c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.743666 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts" (OuterVolumeSpecName: "scripts") pod "c9cf42d8-ed15-44dd-aaed-fbffa29417c4" (UID: "c9cf42d8-ed15-44dd-aaed-fbffa29417c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.747068 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk" (OuterVolumeSpecName: "kube-api-access-8k4xk") pod "c9cf42d8-ed15-44dd-aaed-fbffa29417c4" (UID: "c9cf42d8-ed15-44dd-aaed-fbffa29417c4"). InnerVolumeSpecName "kube-api-access-8k4xk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.747065 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c9cf42d8-ed15-44dd-aaed-fbffa29417c4" (UID: "c9cf42d8-ed15-44dd-aaed-fbffa29417c4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.844849 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4xk\" (UniqueName: \"kubernetes.io/projected/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-kube-api-access-8k4xk\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.844884 4699 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.844894 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.844907 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.844919 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9cf42d8-ed15-44dd-aaed-fbffa29417c4-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.846446 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.853416 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.865451 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d649b895f-2cm8f"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.878945 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f6f7dcd75-m9jm6"
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.945814 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.945877 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key\") pod \"41ed545b-f613-4408-bd1c-df5a09432e39\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.945919 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.945975 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxfq\" (UniqueName: \"kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946014 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946038 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle\") pod \"ae813248-510e-4b19-bcd8-39cefca6cd37\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946062 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n82h\" (UniqueName: \"kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h\") pod \"41ed545b-f613-4408-bd1c-df5a09432e39\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946138 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqk94\" (UniqueName: \"kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94\") pod \"ae813248-510e-4b19-bcd8-39cefca6cd37\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946174 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts\") pod \"d6628395-d6a6-4719-b0ad-10984c3c172b\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946270 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs\") pod \"41ed545b-f613-4408-bd1c-df5a09432e39\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946301 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946328 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config\") pod \"ae813248-510e-4b19-bcd8-39cefca6cd37\" (UID: \"ae813248-510e-4b19-bcd8-39cefca6cd37\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946366 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfkrb\" (UniqueName: \"kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb\") pod \"d6628395-d6a6-4719-b0ad-10984c3c172b\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946408 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data\") pod \"41ed545b-f613-4408-bd1c-df5a09432e39\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946441 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data\") pod \"d6628395-d6a6-4719-b0ad-10984c3c172b\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946465 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config\") pod \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\" (UID: \"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946489 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key\") pod \"d6628395-d6a6-4719-b0ad-10984c3c172b\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946522 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs\") pod \"d6628395-d6a6-4719-b0ad-10984c3c172b\" (UID: \"d6628395-d6a6-4719-b0ad-10984c3c172b\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.946558 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts\") pod \"41ed545b-f613-4408-bd1c-df5a09432e39\" (UID: \"41ed545b-f613-4408-bd1c-df5a09432e39\") "
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.947652 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts" (OuterVolumeSpecName: "scripts") pod "41ed545b-f613-4408-bd1c-df5a09432e39" (UID: "41ed545b-f613-4408-bd1c-df5a09432e39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.947699 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data" (OuterVolumeSpecName: "config-data") pod "d6628395-d6a6-4719-b0ad-10984c3c172b" (UID: "d6628395-d6a6-4719-b0ad-10984c3c172b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.954341 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts" (OuterVolumeSpecName: "scripts") pod "d6628395-d6a6-4719-b0ad-10984c3c172b" (UID: "d6628395-d6a6-4719-b0ad-10984c3c172b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.954594 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94" (OuterVolumeSpecName: "kube-api-access-pqk94") pod "ae813248-510e-4b19-bcd8-39cefca6cd37" (UID: "ae813248-510e-4b19-bcd8-39cefca6cd37"). InnerVolumeSpecName "kube-api-access-pqk94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.956073 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h" (OuterVolumeSpecName: "kube-api-access-7n82h") pod "41ed545b-f613-4408-bd1c-df5a09432e39" (UID: "41ed545b-f613-4408-bd1c-df5a09432e39"). InnerVolumeSpecName "kube-api-access-7n82h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.956454 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq" (OuterVolumeSpecName: "kube-api-access-lrxfq") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "kube-api-access-lrxfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.956460 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs" (OuterVolumeSpecName: "logs") pod "d6628395-d6a6-4719-b0ad-10984c3c172b" (UID: "d6628395-d6a6-4719-b0ad-10984c3c172b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.956916 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs" (OuterVolumeSpecName: "logs") pod "41ed545b-f613-4408-bd1c-df5a09432e39" (UID: "41ed545b-f613-4408-bd1c-df5a09432e39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.957147 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data" (OuterVolumeSpecName: "config-data") pod "41ed545b-f613-4408-bd1c-df5a09432e39" (UID: "41ed545b-f613-4408-bd1c-df5a09432e39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.960051 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "41ed545b-f613-4408-bd1c-df5a09432e39" (UID: "41ed545b-f613-4408-bd1c-df5a09432e39"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.962337 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6628395-d6a6-4719-b0ad-10984c3c172b" (UID: "d6628395-d6a6-4719-b0ad-10984c3c172b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.962814 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb" (OuterVolumeSpecName: "kube-api-access-tfkrb") pod "d6628395-d6a6-4719-b0ad-10984c3c172b" (UID: "d6628395-d6a6-4719-b0ad-10984c3c172b"). InnerVolumeSpecName "kube-api-access-tfkrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.981700 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae813248-510e-4b19-bcd8-39cefca6cd37" (UID: "ae813248-510e-4b19-bcd8-39cefca6cd37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:45 crc kubenswrapper[4699]: I0226 11:31:45.984281 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config" (OuterVolumeSpecName: "config") pod "ae813248-510e-4b19-bcd8-39cefca6cd37" (UID: "ae813248-510e-4b19-bcd8-39cefca6cd37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.003818 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.003839 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config" (OuterVolumeSpecName: "config") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.004804 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.004905 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.005961 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" (UID: "6ebd52ff-bacc-40c7-afc5-83df5a1c36e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048507 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6628395-d6a6-4719-b0ad-10984c3c172b-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048545 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048559 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048573 4699 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41ed545b-f613-4408-bd1c-df5a09432e39-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048589 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048599 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxfq\" (UniqueName: \"kubernetes.io/projected/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-kube-api-access-lrxfq\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048610 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048623 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048633 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n82h\" (UniqueName: \"kubernetes.io/projected/41ed545b-f613-4408-bd1c-df5a09432e39-kube-api-access-7n82h\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048642 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqk94\" (UniqueName: \"kubernetes.io/projected/ae813248-510e-4b19-bcd8-39cefca6cd37-kube-api-access-pqk94\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048652 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048661 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ed545b-f613-4408-bd1c-df5a09432e39-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048670 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048680 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae813248-510e-4b19-bcd8-39cefca6cd37-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048690 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfkrb\" (UniqueName: \"kubernetes.io/projected/d6628395-d6a6-4719-b0ad-10984c3c172b-kube-api-access-tfkrb\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048700 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41ed545b-f613-4408-bd1c-df5a09432e39-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048712 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6628395-d6a6-4719-b0ad-10984c3c172b-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048722 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.048731 4699 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6628395-d6a6-4719-b0ad-10984c3c172b-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.265865 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-db87b77d9-ns48f"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.265887 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f6f7dcd75-m9jm6"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.265922 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.266302 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d649b895f-2cm8f"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.269618 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dr78q"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283816 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db87b77d9-ns48f" event={"ID":"c9cf42d8-ed15-44dd-aaed-fbffa29417c4","Type":"ContainerDied","Data":"1e699d538cdebfe589c475a344848f907766835460184b9e330ed614ebb6483c"}
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283861 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6f7dcd75-m9jm6" event={"ID":"41ed545b-f613-4408-bd1c-df5a09432e39","Type":"ContainerDied","Data":"b1308b571f0b2b92fb651e80c640ed1db7c81e3d85041ae47619d6dae7c87aad"}
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283877 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" event={"ID":"6ebd52ff-bacc-40c7-afc5-83df5a1c36e9","Type":"ContainerDied","Data":"db12e6ab7e70b99da81ac4834b205007d7df170db9b9e0a8bd4ab5007bbb10d9"}
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283894 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d649b895f-2cm8f" event={"ID":"d6628395-d6a6-4719-b0ad-10984c3c172b","Type":"ContainerDied","Data":"cf033b29d548f1e02ed1b1bad110c9d77ffdf16f842c34bb4ebc18230fbed6bf"}
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283908 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dr78q" event={"ID":"ae813248-510e-4b19-bcd8-39cefca6cd37","Type":"ContainerDied","Data":"41ed63c8f69999d16d3b8a0632b0099f90cd743a1b305a70c63928dff741248e"}
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.283922 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ed63c8f69999d16d3b8a0632b0099f90cd743a1b305a70c63928dff741248e"
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.371420 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.386534 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f6f7dcd75-m9jm6"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.411973 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.432972 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d649b895f-2cm8f"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.446520 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-db87b77d9-ns48f"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.453023 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-db87b77d9-ns48f"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.459098 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.465669 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-c6zn7"]
Feb 26 11:31:46 crc kubenswrapper[4699]: I0226 11:31:46.984309 4699 scope.go:117] "RemoveContainer" containerID="1784112af5cd06d4ca4320e949f04a24c30b53553bdf95678319674202498461"
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.078476 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.079054 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mr9sd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f49xd_openstack(8426fd89-9eba-46fa-8611-e98cc7636b41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.080199 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f49xd" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.137259 4699 scope.go:117] "RemoveContainer" containerID="790e1bee9a89611157009e82024f95d4afa15834cb397b52f1c9c892d7cb8150"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.153903 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"]
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.154328 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae813248-510e-4b19-bcd8-39cefca6cd37" containerName="neutron-db-sync"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.154342 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae813248-510e-4b19-bcd8-39cefca6cd37" containerName="neutron-db-sync"
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.154368 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.154374 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns"
Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.154394 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="init"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.154400 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="init"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.154587 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae813248-510e-4b19-bcd8-39cefca6cd37" containerName="neutron-db-sync"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.154606 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.159829 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.178074 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"]
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.194375 4699 scope.go:117] "RemoveContainer" containerID="fe976bbefde2fa99a8167c39df0e86003afc4a567d5a020335a060d2c650e894"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.252990 4699 scope.go:117] "RemoveContainer" containerID="9682f0a3316099cd400015d1d5abe7c7f75f2f43640ff21520a7cddc2ba23260"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.272563 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznpb\" (UniqueName: \"kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.272951 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.272973 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79"
Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.273004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.273079 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.273165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.333223 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.339062 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.342003 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.342454 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xrfkn" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.342525 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.343360 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 26 11:31:47 crc kubenswrapper[4699]: E0226 11:31:47.347031 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-f49xd" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.371170 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376524 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376778 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznpb\" (UniqueName: \"kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: 
\"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376801 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376819 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.376902 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.377913 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " 
pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.378005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.378159 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.378951 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.380812 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.390805 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57899c756d-w9pc5"] Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.406816 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznpb\" (UniqueName: \"kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb\") pod 
\"dnsmasq-dns-5ccc5c4795-x6m79\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.478010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.478055 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.478092 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8wb\" (UniqueName: \"kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.478148 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.478220 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config\") pod 
\"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.566851 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.581126 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.581174 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.581206 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8wb\" (UniqueName: \"kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.581237 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.581290 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.587784 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.590451 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.593360 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.602226 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config\") pod \"neutron-59dd795c56-7kv72\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.606075 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8wb\" (UniqueName: \"kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb\") pod \"neutron-59dd795c56-7kv72\" (UID: 
\"715a80f0-cdba-439c-8a82-4838bf8f7e50\") " pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.655010 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5795557cd8-dvzqq"] Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.682996 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.756043 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-28v5g"] Feb 26 11:31:47 crc kubenswrapper[4699]: I0226 11:31:47.892081 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:47 crc kubenswrapper[4699]: W0226 11:31:47.954393 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fcef8c4_762f_45c5_9087_fdfd43cd166f.slice/crio-d882d7425cfad82c938bd8e161f347bff453aefcb2f7ce55cc7f9962a1234809 WatchSource:0}: Error finding container d882d7425cfad82c938bd8e161f347bff453aefcb2f7ce55cc7f9962a1234809: Status 404 returned error can't find the container with id d882d7425cfad82c938bd8e161f347bff453aefcb2f7ce55cc7f9962a1234809 Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.236042 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"] Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.273197 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ed545b-f613-4408-bd1c-df5a09432e39" path="/var/lib/kubelet/pods/41ed545b-f613-4408-bd1c-df5a09432e39/volumes" Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.273563 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" path="/var/lib/kubelet/pods/6ebd52ff-bacc-40c7-afc5-83df5a1c36e9/volumes" Feb 26 11:31:48 crc kubenswrapper[4699]: 
I0226 11:31:48.274368 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cf42d8-ed15-44dd-aaed-fbffa29417c4" path="/var/lib/kubelet/pods/c9cf42d8-ed15-44dd-aaed-fbffa29417c4/volumes" Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.274787 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6628395-d6a6-4719-b0ad-10984c3c172b" path="/var/lib/kubelet/pods/d6628395-d6a6-4719-b0ad-10984c3c172b/volumes" Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.349795 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5795557cd8-dvzqq" event={"ID":"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0","Type":"ContainerStarted","Data":"e395594ad1f61e5feb4034016d0fe14bffeb3165820e50ecb46c83448ae5661a"} Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.351483 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerStarted","Data":"70b6c63ca13b9c59a7d033612c4fd91b9c2d11c7f06db99a50ef89d5c7c7c5da"} Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.352892 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerStarted","Data":"d882d7425cfad82c938bd8e161f347bff453aefcb2f7ce55cc7f9962a1234809"} Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.358469 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7g59c" event={"ID":"d45d20cb-c561-4b84-b327-9b096865e8bb","Type":"ContainerStarted","Data":"4266f5dcbf67cb6303072faf9cd69cd6aabcaee0bb9544fa39ab82b24cc3c4e5"} Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.367686 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-28v5g" 
event={"ID":"b33c7b6e-a78a-4a10-848c-a65d01deee0b","Type":"ContainerStarted","Data":"2c50ad90e0d44eb8ed21f890b451db6090ce5a989b38e99bb109caa8d5b20956"} Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.384650 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7g59c" podStartSLOduration=4.470968065 podStartE2EDuration="32.384622459s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="2026-02-26 11:31:17.810524714 +0000 UTC m=+1223.621351148" lastFinishedPulling="2026-02-26 11:31:45.724179098 +0000 UTC m=+1251.535005542" observedRunningTime="2026-02-26 11:31:48.37359064 +0000 UTC m=+1254.184417094" watchObservedRunningTime="2026-02-26 11:31:48.384622459 +0000 UTC m=+1254.195448893" Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.417129 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:31:48 crc kubenswrapper[4699]: W0226 11:31:48.445082 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a11bcd_db42_43bf_86ca_90fafb25674e.slice/crio-f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455 WatchSource:0}: Error finding container f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455: Status 404 returned error can't find the container with id f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455 Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.827630 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:31:48 crc kubenswrapper[4699]: W0226 11:31:48.860051 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c856fe4_2ae4_4e5d_8112_a367658a5082.slice/crio-534a666cc64d5aef0fcdad971cc27654c030250fd2a92fa19d0af2b3628f9287 WatchSource:0}: Error finding container 
534a666cc64d5aef0fcdad971cc27654c030250fd2a92fa19d0af2b3628f9287: Status 404 returned error can't find the container with id 534a666cc64d5aef0fcdad971cc27654c030250fd2a92fa19d0af2b3628f9287 Feb 26 11:31:48 crc kubenswrapper[4699]: I0226 11:31:48.978977 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-c6zn7" podUID="6ebd52ff-bacc-40c7-afc5-83df5a1c36e9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.436161 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerStarted","Data":"534a666cc64d5aef0fcdad971cc27654c030250fd2a92fa19d0af2b3628f9287"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.439488 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-28v5g" event={"ID":"b33c7b6e-a78a-4a10-848c-a65d01deee0b","Type":"ContainerStarted","Data":"861736c6decfb2ac1c3010699205e1df4da771409780863184ec8e9136dd76db"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.462637 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerStarted","Data":"79c878075032024e487997f5af9db4e3be830d392f6df8fc08ba5dbf79596db4"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.462696 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerStarted","Data":"e5405c70871ee395752c1da1df07066a938aedcb1ac422f960283753ab469ce2"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.462706 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" 
event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerStarted","Data":"6345d756a7b816036dc69f325dd74145097fc551abbeb710dfcdf0451b76e1c8"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.463023 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59dd795c56-7kv72" Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.465104 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-28v5g" podStartSLOduration=23.46508292 podStartE2EDuration="23.46508292s" podCreationTimestamp="2026-02-26 11:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:49.460005113 +0000 UTC m=+1255.270831557" watchObservedRunningTime="2026-02-26 11:31:49.46508292 +0000 UTC m=+1255.275909354" Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.469962 4699 generic.go:334] "Generic (PLEG): container finished" podID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerID="3c60b289616323cd6352bf0b5554d4a5d5ee327ffbb6b71e27e82bb85958f651" exitCode=0 Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.470059 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" event={"ID":"a1a11bcd-db42-43bf-86ca-90fafb25674e","Type":"ContainerDied","Data":"3c60b289616323cd6352bf0b5554d4a5d5ee327ffbb6b71e27e82bb85958f651"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.470090 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" event={"ID":"a1a11bcd-db42-43bf-86ca-90fafb25674e","Type":"ContainerStarted","Data":"f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.474673 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5795557cd8-dvzqq" 
event={"ID":"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0","Type":"ContainerStarted","Data":"1767b2abf735105a3b07ed8e99603ab13b42e9a5e09f31946a5ccb228e9ee1f0"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.474714 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5795557cd8-dvzqq" event={"ID":"15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0","Type":"ContainerStarted","Data":"aafcea7d5b89f880d565d186cbde95af7c3060362e13d7c6782b92f5d4756b45"} Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.497615 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59dd795c56-7kv72" podStartSLOduration=2.497582799 podStartE2EDuration="2.497582799s" podCreationTimestamp="2026-02-26 11:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:49.480820784 +0000 UTC m=+1255.291647238" watchObservedRunningTime="2026-02-26 11:31:49.497582799 +0000 UTC m=+1255.308409233" Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.518238 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5795557cd8-dvzqq" podStartSLOduration=21.684463887 podStartE2EDuration="22.518216814s" podCreationTimestamp="2026-02-26 11:31:27 +0000 UTC" firstStartedPulling="2026-02-26 11:31:47.728372517 +0000 UTC m=+1253.539198951" lastFinishedPulling="2026-02-26 11:31:48.562125444 +0000 UTC m=+1254.372951878" observedRunningTime="2026-02-26 11:31:49.512300204 +0000 UTC m=+1255.323126658" watchObservedRunningTime="2026-02-26 11:31:49.518216814 +0000 UTC m=+1255.329043248" Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.524053 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerStarted","Data":"5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d"} Feb 26 11:31:49 crc 
kubenswrapper[4699]: I0226 11:31:49.524109 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerStarted","Data":"de9a25314ef41f7d3414b57dcaeec2a9add4d5ecb708b80dc9af27c79856ba9b"}
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.545735 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerStarted","Data":"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"}
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.548910 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerStarted","Data":"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d"}
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.586023 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57899c756d-w9pc5" podStartSLOduration=21.498345813 podStartE2EDuration="22.586005842s" podCreationTimestamp="2026-02-26 11:31:27 +0000 UTC" firstStartedPulling="2026-02-26 11:31:47.360964357 +0000 UTC m=+1253.171790791" lastFinishedPulling="2026-02-26 11:31:48.448624386 +0000 UTC m=+1254.259450820" observedRunningTime="2026-02-26 11:31:49.581978146 +0000 UTC m=+1255.392804580" watchObservedRunningTime="2026-02-26 11:31:49.586005842 +0000 UTC m=+1255.396832296"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.633476 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"]
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.637066 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.638472 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"]
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.641257 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.641449 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734703 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734814 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734849 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734906 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734933 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.734958 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g2x8\" (UniqueName: \"kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.735021 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837183 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837535 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837581 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837648 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837667 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.837691 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g2x8\" (UniqueName: \"kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.863461 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.864154 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.864395 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.864160 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.865084 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.876971 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:49 crc kubenswrapper[4699]: I0226 11:31:49.888036 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g2x8\" (UniqueName: \"kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8\") pod \"neutron-6dc5565bbf-zgvcg\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.007864 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dc5565bbf-zgvcg"
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.572229 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerStarted","Data":"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"}
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.572696 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-log" containerID="cri-o://1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb" gracePeriod=30
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.572846 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-httpd" containerID="cri-o://1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d" gracePeriod=30
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.584814 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerStarted","Data":"86f7637b447ffd260d1c029f9ebaf1fd5c0a52784cd3264877063821ada8e279"}
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.597744 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.597726239 podStartE2EDuration="24.597726239s" podCreationTimestamp="2026-02-26 11:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:50.594130305 +0000 UTC m=+1256.404956769" watchObservedRunningTime="2026-02-26 11:31:50.597726239 +0000 UTC m=+1256.408552673"
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.599198 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" event={"ID":"a1a11bcd-db42-43bf-86ca-90fafb25674e","Type":"ContainerStarted","Data":"f6df4021899217dba2f01191869995ca628d93d69016b474ac26db11ce7351f9"}
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.600285 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79"
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.625041 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"]
Feb 26 11:31:50 crc kubenswrapper[4699]: I0226 11:31:50.641873 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" podStartSLOduration=3.641854994 podStartE2EDuration="3.641854994s" podCreationTimestamp="2026-02-26 11:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:50.62719826 +0000 UTC m=+1256.438024694" watchObservedRunningTime="2026-02-26 11:31:50.641854994 +0000 UTC m=+1256.452681428"
Feb 26 11:31:50 crc kubenswrapper[4699]: W0226 11:31:50.827092 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73fd43db_ab24_441d_9912_881ef04d4572.slice/crio-31b648d87b25df09b072d95e938824b9e321e65ded6b88d9eed7727a038a5155 WatchSource:0}: Error finding container 31b648d87b25df09b072d95e938824b9e321e65ded6b88d9eed7727a038a5155: Status 404 returned error can't find the container with id 31b648d87b25df09b072d95e938824b9e321e65ded6b88d9eed7727a038a5155
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.309859 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.370808 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.370909 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.370936 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.370983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.371060 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb644\" (UniqueName: \"kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.371095 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.371132 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle\") pod \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\" (UID: \"6fcef8c4-762f-45c5-9087-fdfd43cd166f\") "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.371367 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs" (OuterVolumeSpecName: "logs") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.371789 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.372621 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.377621 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644" (OuterVolumeSpecName: "kube-api-access-wb644") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "kube-api-access-wb644". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.379244 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts" (OuterVolumeSpecName: "scripts") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.380220 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.404345 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.441403 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data" (OuterVolumeSpecName: "config-data") pod "6fcef8c4-762f-45c5-9087-fdfd43cd166f" (UID: "6fcef8c4-762f-45c5-9087-fdfd43cd166f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474756 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb644\" (UniqueName: \"kubernetes.io/projected/6fcef8c4-762f-45c5-9087-fdfd43cd166f-kube-api-access-wb644\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474789 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6fcef8c4-762f-45c5-9087-fdfd43cd166f-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474799 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474809 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474842 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.474855 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fcef8c4-762f-45c5-9087-fdfd43cd166f-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.502672 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.576358 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.616062 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerStarted","Data":"31b648d87b25df09b072d95e938824b9e321e65ded6b88d9eed7727a038a5155"}
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618777 4699 generic.go:334] "Generic (PLEG): container finished" podID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerID="1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d" exitCode=143
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618801 4699 generic.go:334] "Generic (PLEG): container finished" podID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerID="1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb" exitCode=143
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618835 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerDied","Data":"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"}
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618857 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerDied","Data":"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"}
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618867 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6fcef8c4-762f-45c5-9087-fdfd43cd166f","Type":"ContainerDied","Data":"d882d7425cfad82c938bd8e161f347bff453aefcb2f7ce55cc7f9962a1234809"}
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618882 4699 scope.go:117] "RemoveContainer" containerID="1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.618999 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.622759 4699 generic.go:334] "Generic (PLEG): container finished" podID="d45d20cb-c561-4b84-b327-9b096865e8bb" containerID="4266f5dcbf67cb6303072faf9cd69cd6aabcaee0bb9544fa39ab82b24cc3c4e5" exitCode=0
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.623313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7g59c" event={"ID":"d45d20cb-c561-4b84-b327-9b096865e8bb","Type":"ContainerDied","Data":"4266f5dcbf67cb6303072faf9cd69cd6aabcaee0bb9544fa39ab82b24cc3c4e5"}
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.715844 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.736810 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.746863 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:51 crc kubenswrapper[4699]: E0226 11:31:51.750378 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-log"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.750421 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-log"
Feb 26 11:31:51 crc kubenswrapper[4699]: E0226 11:31:51.750446 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-httpd"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.750454 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-httpd"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.750738 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-log"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.750765 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" containerName="glance-httpd"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.751898 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.759783 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.761835 4699 scope.go:117] "RemoveContainer" containerID="1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.776463 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.776486 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885177 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885671 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885702 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885757 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885779 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7ml\" (UniqueName: \"kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885801 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885865 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.885889 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.928789 4699 scope.go:117] "RemoveContainer" containerID="1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"
Feb 26 11:31:51 crc kubenswrapper[4699]: E0226 11:31:51.932253 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d\": container with ID starting with 1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d not found: ID does not exist" containerID="1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.932288 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"} err="failed to get container status \"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d\": rpc error: code = NotFound desc = could not find container \"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d\": container with ID starting with 1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d not found: ID does not exist"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.932309 4699 scope.go:117] "RemoveContainer" containerID="1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"
Feb 26 11:31:51 crc kubenswrapper[4699]: E0226 11:31:51.935200 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb\": container with ID starting with 1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb not found: ID does not exist" containerID="1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.935255 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"} err="failed to get container status \"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb\": rpc error: code = NotFound desc = could not find container \"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb\": container with ID starting with 1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb not found: ID does not exist"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.935275 4699 scope.go:117] "RemoveContainer" containerID="1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.936875 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d"} err="failed to get container status \"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d\": rpc error: code = NotFound desc = could not find container \"1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d\": container with ID starting with 1ae84e423f8195999f8eff78d0bcd4d1b0012d9d2c7560a448febb038de27c9d not found: ID does not exist"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.936898 4699 scope.go:117] "RemoveContainer" containerID="1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.939301 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb"} err="failed to get container status \"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb\": rpc error: code = NotFound desc = could not find container \"1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb\": container with ID starting with 1046f31d04f2776b06f0de44d09b87f8e5ad14a1988284dd4ff3881de9d6b1cb not found: ID does not exist"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988182 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988231 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988300 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988321 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988366 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988386 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7ml\" (UniqueName: \"kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.988406 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.989213 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.989817 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:51 crc kubenswrapper[4699]: I0226 11:31:51.994860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.002692 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.004009 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.005995 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.023983 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6q7ml\" (UniqueName: \"kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.027569 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.060182 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.201532 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.280827 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fcef8c4-762f-45c5-9087-fdfd43cd166f" path="/var/lib/kubelet/pods/6fcef8c4-762f-45c5-9087-fdfd43cd166f/volumes" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.640882 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerStarted","Data":"da66386168828e12898775322c74105b9a00cb1f54506a25a4d1fcf0d9e86a7e"} Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.641032 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-log" containerID="cri-o://86f7637b447ffd260d1c029f9ebaf1fd5c0a52784cd3264877063821ada8e279" gracePeriod=30 Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.641642 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-httpd" containerID="cri-o://da66386168828e12898775322c74105b9a00cb1f54506a25a4d1fcf0d9e86a7e" gracePeriod=30 Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.646535 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z6w9z" event={"ID":"47a9d008-5b7e-4866-b92b-efcb60cbfdb0","Type":"ContainerStarted","Data":"45bdc052e6dc259f4ccec396b223ed5d541f623efae769fc3c166913b1ca187a"} Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.647842 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerStarted","Data":"f1b43b05d45b05ac3c54d378fa118972d9e5848b345eada8b66bb2c67ea89c63"} Feb 26 11:31:52 crc 
kubenswrapper[4699]: I0226 11:31:52.678131 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.678096637 podStartE2EDuration="26.678096637s" podCreationTimestamp="2026-02-26 11:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:52.669477768 +0000 UTC m=+1258.480304222" watchObservedRunningTime="2026-02-26 11:31:52.678096637 +0000 UTC m=+1258.488923081" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.699199 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-z6w9z" podStartSLOduration=3.789479583 podStartE2EDuration="36.699179116s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="2026-02-26 11:31:17.976464666 +0000 UTC m=+1223.787291100" lastFinishedPulling="2026-02-26 11:31:50.886164199 +0000 UTC m=+1256.696990633" observedRunningTime="2026-02-26 11:31:52.692869483 +0000 UTC m=+1258.503695927" watchObservedRunningTime="2026-02-26 11:31:52.699179116 +0000 UTC m=+1258.510005570" Feb 26 11:31:52 crc kubenswrapper[4699]: I0226 11:31:52.793957 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.035770 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.120128 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle\") pod \"d45d20cb-c561-4b84-b327-9b096865e8bb\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.120278 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data\") pod \"d45d20cb-c561-4b84-b327-9b096865e8bb\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.120378 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmccd\" (UniqueName: \"kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd\") pod \"d45d20cb-c561-4b84-b327-9b096865e8bb\" (UID: \"d45d20cb-c561-4b84-b327-9b096865e8bb\") " Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.127654 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd" (OuterVolumeSpecName: "kube-api-access-xmccd") pod "d45d20cb-c561-4b84-b327-9b096865e8bb" (UID: "d45d20cb-c561-4b84-b327-9b096865e8bb"). InnerVolumeSpecName "kube-api-access-xmccd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.128429 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d45d20cb-c561-4b84-b327-9b096865e8bb" (UID: "d45d20cb-c561-4b84-b327-9b096865e8bb"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.144818 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d45d20cb-c561-4b84-b327-9b096865e8bb" (UID: "d45d20cb-c561-4b84-b327-9b096865e8bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.222320 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmccd\" (UniqueName: \"kubernetes.io/projected/d45d20cb-c561-4b84-b327-9b096865e8bb-kube-api-access-xmccd\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.222361 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.222376 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d45d20cb-c561-4b84-b327-9b096865e8bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.689409 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerStarted","Data":"93414ab02ed9ca4e817beb6280ab1441d20975697c632df8c1a82aa6fe45a0b0"} Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.692240 4699 generic.go:334] "Generic (PLEG): container finished" podID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerID="da66386168828e12898775322c74105b9a00cb1f54506a25a4d1fcf0d9e86a7e" exitCode=0 Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 
11:31:53.692276 4699 generic.go:334] "Generic (PLEG): container finished" podID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerID="86f7637b447ffd260d1c029f9ebaf1fd5c0a52784cd3264877063821ada8e279" exitCode=143 Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.692315 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerDied","Data":"da66386168828e12898775322c74105b9a00cb1f54506a25a4d1fcf0d9e86a7e"} Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.692331 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerDied","Data":"86f7637b447ffd260d1c029f9ebaf1fd5c0a52784cd3264877063821ada8e279"} Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.695766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7g59c" event={"ID":"d45d20cb-c561-4b84-b327-9b096865e8bb","Type":"ContainerDied","Data":"a8bf2edfe1a0cab1df993c5f3eabf3a6892b72d4d33db983d7476af16ba0c19b"} Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.695822 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8bf2edfe1a0cab1df993c5f3eabf3a6892b72d4d33db983d7476af16ba0c19b" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.697988 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7g59c" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.710068 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerStarted","Data":"fb5409015c0850abe735cc049f283c49118298bd94a368b4191042b9fb38469f"} Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.710763 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dc5565bbf-zgvcg" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.733034 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dc5565bbf-zgvcg" podStartSLOduration=4.7330120220000005 podStartE2EDuration="4.733012022s" podCreationTimestamp="2026-02-26 11:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:31:53.728643795 +0000 UTC m=+1259.539470249" watchObservedRunningTime="2026-02-26 11:31:53.733012022 +0000 UTC m=+1259.543838476" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.802585 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"] Feb 26 11:31:53 crc kubenswrapper[4699]: E0226 11:31:53.803034 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d20cb-c561-4b84-b327-9b096865e8bb" containerName="barbican-db-sync" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.803057 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d20cb-c561-4b84-b327-9b096865e8bb" containerName="barbican-db-sync" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.803279 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45d20cb-c561-4b84-b327-9b096865e8bb" containerName="barbican-db-sync" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.804208 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.806646 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zs6cf" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.806817 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.807409 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.831326 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"] Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.877295 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"] Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.895427 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.903321 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.941055 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.941677 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb7t4\" (UniqueName: \"kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.941794 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.942083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.942324 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:53 crc kubenswrapper[4699]: I0226 11:31:53.947674 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"] Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.040183 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"] Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.046050 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.078725 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.078840 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.078917 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079078 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079129 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079159 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079175 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzs9k\" (UniqueName: \"kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: 
\"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079246 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb7t4\" (UniqueName: \"kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.079317 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.055456 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="dnsmasq-dns" containerID="cri-o://f6df4021899217dba2f01191869995ca628d93d69016b474ac26db11ce7351f9" gracePeriod=10 Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.112938 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.114393 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.142187 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"] Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.143743 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.163200 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdfx\" (UniqueName: \"kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181282 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181306 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181326 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181402 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181420 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181440 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181471 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181519 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcgz\" (UniqueName: \"kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181544 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181579 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181628 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzs9k\" (UniqueName: \"kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.181646 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.186030 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"]
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.247349 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.250570 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.265942 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.267593 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.267806 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.268258 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.281918 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283149 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283223 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283258 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283286 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcgz\" (UniqueName: \"kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283329 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283352 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283393 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283445 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdfx\" (UniqueName: \"kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283467 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283488 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.283505 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.284966 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.285209 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.289066 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.293487 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.294003 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb7t4\" (UniqueName: \"kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.299996 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzs9k\" (UniqueName: \"kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k\") pod \"barbican-keystone-listener-8dc77f9b6-7s844\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.301228 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.304450 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.301624 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle\") pod \"barbican-worker-5fd9f445b9-bnr2j\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.309155 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.314171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.314986 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.317669 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.326177 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.340195 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdfx\" (UniqueName: \"kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx\") pod \"barbican-api-7c455f6f5b-f25td\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.341156 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcgz\" (UniqueName: \"kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz\") pod \"dnsmasq-dns-688c87cc99-gg27w\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") " pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.432678 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fd9f445b9-bnr2j"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.549399 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c455f6f5b-f25td"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.549904 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.781078 4699 generic.go:334] "Generic (PLEG): container finished" podID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerID="f6df4021899217dba2f01191869995ca628d93d69016b474ac26db11ce7351f9" exitCode=0
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.781213 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" event={"ID":"a1a11bcd-db42-43bf-86ca-90fafb25674e","Type":"ContainerDied","Data":"f6df4021899217dba2f01191869995ca628d93d69016b474ac26db11ce7351f9"}
Feb 26 11:31:54 crc kubenswrapper[4699]: I0226 11:31:54.793557 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerStarted","Data":"6389553c6ab212c6bdd09de56f8a6c0bc3ab110475816ef41318a3d8e60aa198"}
Feb 26 11:31:55 crc kubenswrapper[4699]: I0226 11:31:55.804008 4699 generic.go:334] "Generic (PLEG): container finished" podID="b33c7b6e-a78a-4a10-848c-a65d01deee0b" containerID="861736c6decfb2ac1c3010699205e1df4da771409780863184ec8e9136dd76db" exitCode=0
Feb 26 11:31:55 crc kubenswrapper[4699]: I0226 11:31:55.804093 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-28v5g" event={"ID":"b33c7b6e-a78a-4a10-848c-a65d01deee0b","Type":"ContainerDied","Data":"861736c6decfb2ac1c3010699205e1df4da771409780863184ec8e9136dd76db"}
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.446420 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521250 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521349 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521374 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521536 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521628 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521644 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698q6\" (UniqueName: \"kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.521684 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs\") pod \"1c856fe4-2ae4-4e5d-8112-a367658a5082\" (UID: \"1c856fe4-2ae4-4e5d-8112-a367658a5082\") "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.522211 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.522463 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs" (OuterVolumeSpecName: "logs") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.527240 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6" (OuterVolumeSpecName: "kube-api-access-698q6") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "kube-api-access-698q6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.529024 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.533746 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts" (OuterVolumeSpecName: "scripts") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.566885 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.599212 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data" (OuterVolumeSpecName: "config-data") pod "1c856fe4-2ae4-4e5d-8112-a367658a5082" (UID: "1c856fe4-2ae4-4e5d-8112-a367658a5082"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627296 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627338 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698q6\" (UniqueName: \"kubernetes.io/projected/1c856fe4-2ae4-4e5d-8112-a367658a5082-kube-api-access-698q6\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627352 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c856fe4-2ae4-4e5d-8112-a367658a5082-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627393 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627407 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.627417 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c856fe4-2ae4-4e5d-8112-a367658a5082-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.645877 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.729689 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.813603 4699 generic.go:334] "Generic (PLEG): container finished" podID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" containerID="45bdc052e6dc259f4ccec396b223ed5d541f623efae769fc3c166913b1ca187a" exitCode=0
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.813813 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z6w9z" event={"ID":"47a9d008-5b7e-4866-b92b-efcb60cbfdb0","Type":"ContainerDied","Data":"45bdc052e6dc259f4ccec396b223ed5d541f623efae769fc3c166913b1ca187a"}
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.816265 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c856fe4-2ae4-4e5d-8112-a367658a5082","Type":"ContainerDied","Data":"534a666cc64d5aef0fcdad971cc27654c030250fd2a92fa19d0af2b3628f9287"}
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.816283 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.816305 4699 scope.go:117] "RemoveContainer" containerID="da66386168828e12898775322c74105b9a00cb1f54506a25a4d1fcf0d9e86a7e"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.885800 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.893585 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.908718 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:56 crc kubenswrapper[4699]: E0226 11:31:56.909211 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-httpd"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.909232 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-httpd"
Feb 26 11:31:56 crc kubenswrapper[4699]: E0226 11:31:56.909242 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-log"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.909249 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-log"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.909452 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-httpd"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.909473 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" containerName="glance-log"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.910491 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.913591 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.916335 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934564 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934616 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934635 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934655 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934673 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934698 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934738 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2rv\" (UniqueName: \"kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.934780 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:56 crc kubenswrapper[4699]: I0226 11:31:56.938574 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.008964 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"]
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.010634 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84b6bf6c74-r47qt"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.013540 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.013820 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.018397 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"]
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039712 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039790 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039817 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039858 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039891 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.039958 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2rv\" (UniqueName: \"kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.040030 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.040255 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.046681 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.046773 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.046872 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.052635 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.055410 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.068712 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle\")
pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.075767 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2rv\" (UniqueName: \"kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.080888 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.088742 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " pod="openstack/glance-default-internal-api-0" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148256 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85rxv\" (UniqueName: \"kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148328 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" 
(UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148372 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148435 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148474 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148551 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.148584 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom\") pod 
\"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.240580 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250528 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250581 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250619 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250707 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.250765 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85rxv\" (UniqueName: \"kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.251511 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.255004 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.255639 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.256174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.256895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.258093 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.267876 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85rxv\" (UniqueName: \"kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv\") pod \"barbican-api-84b6bf6c74-r47qt\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:57 crc kubenswrapper[4699]: I0226 11:31:57.347268 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.131673 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.132230 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.212477 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.213653 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5795557cd8-dvzqq" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.279543 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c856fe4-2ae4-4e5d-8112-a367658a5082" path="/var/lib/kubelet/pods/1c856fe4-2ae4-4e5d-8112-a367658a5082/volumes" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.832026 4699 scope.go:117] "RemoveContainer" containerID="86f7637b447ffd260d1c029f9ebaf1fd5c0a52784cd3264877063821ada8e279" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.843515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" event={"ID":"a1a11bcd-db42-43bf-86ca-90fafb25674e","Type":"ContainerDied","Data":"f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455"} Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.843583 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9de2c3df25a9cf2b2ee0fc78e93892cae6d343216a055e8b673d55fa947c455" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.845710 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z6w9z" 
event={"ID":"47a9d008-5b7e-4866-b92b-efcb60cbfdb0","Type":"ContainerDied","Data":"920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616"} Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.845837 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920ac430b947cdc3b32b9b6348a1213ef17636f95a2668e9fab680798b77b616" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.860037 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-28v5g" event={"ID":"b33c7b6e-a78a-4a10-848c-a65d01deee0b","Type":"ContainerDied","Data":"2c50ad90e0d44eb8ed21f890b451db6090ce5a989b38e99bb109caa8d5b20956"} Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.860140 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c50ad90e0d44eb8ed21f890b451db6090ce5a989b38e99bb109caa8d5b20956" Feb 26 11:31:58 crc kubenswrapper[4699]: I0226 11:31:58.996611 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.019162 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.094360 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117691 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vznpb\" (UniqueName: \"kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117772 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117801 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89dst\" (UniqueName: \"kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117837 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117876 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117920 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.117939 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.118044 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.118088 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.118104 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys\") pod \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\" (UID: \"b33c7b6e-a78a-4a10-848c-a65d01deee0b\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.118144 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 
11:31:59.118162 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb\") pod \"a1a11bcd-db42-43bf-86ca-90fafb25674e\" (UID: \"a1a11bcd-db42-43bf-86ca-90fafb25674e\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.139127 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb" (OuterVolumeSpecName: "kube-api-access-vznpb") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "kube-api-access-vznpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.140924 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.140985 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts" (OuterVolumeSpecName: "scripts") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.141194 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.141211 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst" (OuterVolumeSpecName: "kube-api-access-89dst") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "kube-api-access-89dst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.222667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts\") pod \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.222814 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99w54\" (UniqueName: \"kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54\") pod \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.222881 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle\") pod \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.222948 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data\") pod \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.222980 4699 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs\") pod \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\" (UID: \"47a9d008-5b7e-4866-b92b-efcb60cbfdb0\") " Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.223444 4699 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.223460 4699 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.223470 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vznpb\" (UniqueName: \"kubernetes.io/projected/a1a11bcd-db42-43bf-86ca-90fafb25674e-kube-api-access-vznpb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.223478 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89dst\" (UniqueName: \"kubernetes.io/projected/b33c7b6e-a78a-4a10-848c-a65d01deee0b-kube-api-access-89dst\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.223489 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.224031 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs" (OuterVolumeSpecName: "logs") pod "47a9d008-5b7e-4866-b92b-efcb60cbfdb0" (UID: "47a9d008-5b7e-4866-b92b-efcb60cbfdb0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.228095 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54" (OuterVolumeSpecName: "kube-api-access-99w54") pod "47a9d008-5b7e-4866-b92b-efcb60cbfdb0" (UID: "47a9d008-5b7e-4866-b92b-efcb60cbfdb0"). InnerVolumeSpecName "kube-api-access-99w54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.233333 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts" (OuterVolumeSpecName: "scripts") pod "47a9d008-5b7e-4866-b92b-efcb60cbfdb0" (UID: "47a9d008-5b7e-4866-b92b-efcb60cbfdb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.233429 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.245157 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.278266 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.294991 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data" (OuterVolumeSpecName: "config-data") pod "b33c7b6e-a78a-4a10-848c-a65d01deee0b" (UID: "b33c7b6e-a78a-4a10-848c-a65d01deee0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.296776 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.322265 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47a9d008-5b7e-4866-b92b-efcb60cbfdb0" (UID: "47a9d008-5b7e-4866-b92b-efcb60cbfdb0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325447 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data" (OuterVolumeSpecName: "config-data") pod "47a9d008-5b7e-4866-b92b-efcb60cbfdb0" (UID: "47a9d008-5b7e-4866-b92b-efcb60cbfdb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325768 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325797 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325808 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325816 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325827 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325835 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-ovsdbserver-sb\") on 
node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325844 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325854 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99w54\" (UniqueName: \"kubernetes.io/projected/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-kube-api-access-99w54\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325863 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33c7b6e-a78a-4a10-848c-a65d01deee0b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.325871 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a9d008-5b7e-4866-b92b-efcb60cbfdb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.359709 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.368897 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config" (OuterVolumeSpecName: "config") pod "a1a11bcd-db42-43bf-86ca-90fafb25674e" (UID: "a1a11bcd-db42-43bf-86ca-90fafb25674e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.440898 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.440945 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a11bcd-db42-43bf-86ca-90fafb25674e-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.485958 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"] Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.665242 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"] Feb 26 11:31:59 crc kubenswrapper[4699]: W0226 11:31:59.678771 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ee9717_aaae_4511_9cee_fb022818e57d.slice/crio-729fa2fe733b6553118627e4796e6e00ed271782aa89de919499cdfc619cd740 WatchSource:0}: Error finding container 729fa2fe733b6553118627e4796e6e00ed271782aa89de919499cdfc619cd740: Status 404 returned error can't find the container with id 729fa2fe733b6553118627e4796e6e00ed271782aa89de919499cdfc619cd740 Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.889683 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerStarted","Data":"64d085c2e0471990e9f05ef5274018eb074bf0ab7cec6ddaf7afcafa1dae6331"} Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.907781 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerStarted","Data":"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2"} Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.916128 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z6w9z" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.918897 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-28v5g" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.918940 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" Feb 26 11:31:59 crc kubenswrapper[4699]: I0226 11:31:59.918996 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" event={"ID":"21ee9717-aaae-4511-9cee-fb022818e57d","Type":"ContainerStarted","Data":"729fa2fe733b6553118627e4796e6e00ed271782aa89de919499cdfc619cd740"} Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.023088 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.053668 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-x6m79"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.064677 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.072329 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.191379 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.210985 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-67d4f89fb9-65kmq"] Feb 26 11:32:00 crc kubenswrapper[4699]: E0226 11:32:00.211508 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" containerName="placement-db-sync" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211526 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" containerName="placement-db-sync" Feb 26 11:32:00 crc kubenswrapper[4699]: E0226 11:32:00.211545 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="init" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211552 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="init" Feb 26 11:32:00 crc kubenswrapper[4699]: E0226 11:32:00.211572 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="dnsmasq-dns" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211580 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="dnsmasq-dns" Feb 26 11:32:00 crc kubenswrapper[4699]: E0226 11:32:00.211613 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33c7b6e-a78a-4a10-848c-a65d01deee0b" containerName="keystone-bootstrap" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211622 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33c7b6e-a78a-4a10-848c-a65d01deee0b" containerName="keystone-bootstrap" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211821 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33c7b6e-a78a-4a10-848c-a65d01deee0b" containerName="keystone-bootstrap" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211861 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" 
containerName="dnsmasq-dns" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.211879 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" containerName="placement-db-sync" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.212714 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.215346 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qbntt" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.215538 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.215701 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.215804 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.215899 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.216084 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.220529 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67d4f89fb9-65kmq"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257646 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-internal-tls-certs\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc 
kubenswrapper[4699]: W0226 11:32:00.257684 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f1bc3b_c587_4c47_bbc2_3dca2240d30c.slice/crio-1807142ca5ad23a6f297805a63bd9002973ffc620dfe243a1b7bd07573b9a98e WatchSource:0}: Error finding container 1807142ca5ad23a6f297805a63bd9002973ffc620dfe243a1b7bd07573b9a98e: Status 404 returned error can't find the container with id 1807142ca5ad23a6f297805a63bd9002973ffc620dfe243a1b7bd07573b9a98e Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257739 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-scripts\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257773 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-credential-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257808 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-combined-ca-bundle\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257835 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-public-tls-certs\") pod 
\"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257899 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hznp\" (UniqueName: \"kubernetes.io/projected/5d9e1983-3363-4542-a5f0-deb132ea6994-kube-api-access-4hznp\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-config-data\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.257987 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-fernet-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.360876 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hznp\" (UniqueName: \"kubernetes.io/projected/5d9e1983-3363-4542-a5f0-deb132ea6994-kube-api-access-4hznp\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.360923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-config-data\") pod 
\"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.360951 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-fernet-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.361094 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-internal-tls-certs\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.361193 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-scripts\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.361233 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-credential-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.361284 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-combined-ca-bundle\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " 
pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.361310 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-public-tls-certs\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.375144 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-config-data\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.375510 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-fernet-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.381834 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" path="/var/lib/kubelet/pods/a1a11bcd-db42-43bf-86ca-90fafb25674e/volumes" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.388903 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535092-t7q4h"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.402939 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-credential-keys\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 
11:32:00.404782 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hznp\" (UniqueName: \"kubernetes.io/projected/5d9e1983-3363-4542-a5f0-deb132ea6994-kube-api-access-4hznp\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.404866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-public-tls-certs\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.406713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-combined-ca-bundle\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.409413 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535092-t7q4h"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.409493 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.410282 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.418455 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.420195 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.420470 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.421259 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.422852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-internal-tls-certs\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.428906 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9e1983-3363-4542-a5f0-deb132ea6994-scripts\") pod \"keystone-67d4f89fb9-65kmq\" (UID: \"5d9e1983-3363-4542-a5f0-deb132ea6994\") " pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.433272 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.447495 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.447539 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.447369 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2ghn5" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.448736 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.449959 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.470820 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.576766 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.576835 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.576857 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.576916 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6sq\" (UniqueName: \"kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.576949 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpjjs\" (UniqueName: \"kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs\") pod \"auto-csr-approver-29535092-t7q4h\" (UID: \"343bb829-035d-4834-a0c4-d9a61c11a2ee\") " pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.577031 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.577055 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 
11:32:00.577158 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.628008 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d4878dd78-qpvzg"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.640239 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4878dd78-qpvzg"] Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.640369 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.641480 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.683954 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpjjs\" (UniqueName: \"kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs\") pod \"auto-csr-approver-29535092-t7q4h\" (UID: \"343bb829-035d-4834-a0c4-d9a61c11a2ee\") " pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684037 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684075 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684135 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684234 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684269 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684300 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.684365 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6sq\" (UniqueName: 
\"kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.685856 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.690185 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.692471 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.697864 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.700510 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: 
\"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.702456 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.707185 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6sq\" (UniqueName: \"kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq\") pod \"placement-78f86c6bf8-r6wpf\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.708383 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpjjs\" (UniqueName: \"kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs\") pod \"auto-csr-approver-29535092-t7q4h\" (UID: \"343bb829-035d-4834-a0c4-d9a61c11a2ee\") " pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.729869 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.775796 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788376 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-internal-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788473 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-config-data\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788517 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-public-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788712 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-scripts\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788935 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7700bd0-21d8-4b96-9753-2619443038a3-logs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " 
pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.788988 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-combined-ca-bundle\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.789102 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bf9k\" (UniqueName: \"kubernetes.io/projected/b7700bd0-21d8-4b96-9753-2619443038a3-kube-api-access-4bf9k\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891077 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7700bd0-21d8-4b96-9753-2619443038a3-logs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891453 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-combined-ca-bundle\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891575 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bf9k\" (UniqueName: \"kubernetes.io/projected/b7700bd0-21d8-4b96-9753-2619443038a3-kube-api-access-4bf9k\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " 
pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891614 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-internal-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891734 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-config-data\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891797 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-public-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.891877 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-scripts\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.894250 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7700bd0-21d8-4b96-9753-2619443038a3-logs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.902688 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-scripts\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.907904 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-config-data\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.908667 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-internal-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.913383 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-combined-ca-bundle\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.915608 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7700bd0-21d8-4b96-9753-2619443038a3-public-tls-certs\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.924340 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bf9k\" (UniqueName: 
\"kubernetes.io/projected/b7700bd0-21d8-4b96-9753-2619443038a3-kube-api-access-4bf9k\") pod \"placement-d4878dd78-qpvzg\" (UID: \"b7700bd0-21d8-4b96-9753-2619443038a3\") " pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.943407 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerStarted","Data":"1807142ca5ad23a6f297805a63bd9002973ffc620dfe243a1b7bd07573b9a98e"} Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.946663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerStarted","Data":"1ea84a2c17c70c4722d76da041934ea3f75af2c65494a5778df946ebb8677371"} Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.950737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerStarted","Data":"079bbabce73c111db6093e96198997a034c6927d448d649260507e6ce83573d4"} Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.952255 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerStarted","Data":"67791da9269463758e09bb6a9c9c2f13b834b1a262a1121df8a5fa0b5b6170cf"} Feb 26 11:32:00 crc kubenswrapper[4699]: I0226 11:32:00.969059 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.206542 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67d4f89fb9-65kmq"] Feb 26 11:32:01 crc kubenswrapper[4699]: W0226 11:32:01.233390 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9e1983_3363_4542_a5f0_deb132ea6994.slice/crio-fc01764fcf706afe2883d1c24a362dbebfdc64389d0d5a484a6dc51e9ddb78de WatchSource:0}: Error finding container fc01764fcf706afe2883d1c24a362dbebfdc64389d0d5a484a6dc51e9ddb78de: Status 404 returned error can't find the container with id fc01764fcf706afe2883d1c24a362dbebfdc64389d0d5a484a6dc51e9ddb78de Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.427304 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535092-t7q4h"] Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.485979 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"] Feb 26 11:32:01 crc kubenswrapper[4699]: W0226 11:32:01.487012 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod343bb829_035d_4834_a0c4_d9a61c11a2ee.slice/crio-79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a WatchSource:0}: Error finding container 79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a: Status 404 returned error can't find the container with id 79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a Feb 26 11:32:01 crc kubenswrapper[4699]: W0226 11:32:01.555970 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ef11cc_2a83_4f0e_b117_4be10a1c0fee.slice/crio-086804bba8040fba8ead2adc36df764be92ea222ee4962825dd9a4df869adac5 WatchSource:0}: Error finding container 
086804bba8040fba8ead2adc36df764be92ea222ee4962825dd9a4df869adac5: Status 404 returned error can't find the container with id 086804bba8040fba8ead2adc36df764be92ea222ee4962825dd9a4df869adac5 Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.590050 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4878dd78-qpvzg"] Feb 26 11:32:01 crc kubenswrapper[4699]: W0226 11:32:01.594203 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7700bd0_21d8_4b96_9753_2619443038a3.slice/crio-2a9002e29912f771e5283986ce768c9698ed7c635c879f18a8a62771838e81bf WatchSource:0}: Error finding container 2a9002e29912f771e5283986ce768c9698ed7c635c879f18a8a62771838e81bf: Status 404 returned error can't find the container with id 2a9002e29912f771e5283986ce768c9698ed7c635c879f18a8a62771838e81bf Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.962913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67d4f89fb9-65kmq" event={"ID":"5d9e1983-3363-4542-a5f0-deb132ea6994","Type":"ContainerStarted","Data":"fc01764fcf706afe2883d1c24a362dbebfdc64389d0d5a484a6dc51e9ddb78de"} Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.963950 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" event={"ID":"343bb829-035d-4834-a0c4-d9a61c11a2ee","Type":"ContainerStarted","Data":"79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a"} Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.965056 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4878dd78-qpvzg" event={"ID":"b7700bd0-21d8-4b96-9753-2619443038a3","Type":"ContainerStarted","Data":"2a9002e29912f771e5283986ce768c9698ed7c635c879f18a8a62771838e81bf"} Feb 26 11:32:01 crc kubenswrapper[4699]: I0226 11:32:01.966055 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerStarted","Data":"086804bba8040fba8ead2adc36df764be92ea222ee4962825dd9a4df869adac5"} Feb 26 11:32:02 crc kubenswrapper[4699]: I0226 11:32:02.568415 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-x6m79" podUID="a1a11bcd-db42-43bf-86ca-90fafb25674e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: i/o timeout" Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.017609 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerStarted","Data":"68ffeebcc8219b513faf07851f7ec0e29081e29acdefcbc0d3a8bcb52016ff06"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.029008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67d4f89fb9-65kmq" event={"ID":"5d9e1983-3363-4542-a5f0-deb132ea6994","Type":"ContainerStarted","Data":"c6475f92b33fc20f6dc96976597c480d28deff6e28b83fbd4087da8b404d81f6"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.029660 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.034678 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4878dd78-qpvzg" event={"ID":"b7700bd0-21d8-4b96-9753-2619443038a3","Type":"ContainerStarted","Data":"c2abd1a49bd60a70897e46e829fede9f4c909194d03a92818f99d204012b49c8"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.058951 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerStarted","Data":"cfb24c179a421c25f4518e3a61ebbebf3cbb893957f33df4ff29a903e7099944"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.068537 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-67d4f89fb9-65kmq" podStartSLOduration=3.068520147 podStartE2EDuration="3.068520147s" podCreationTimestamp="2026-02-26 11:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:03.055564152 +0000 UTC m=+1268.866390616" watchObservedRunningTime="2026-02-26 11:32:03.068520147 +0000 UTC m=+1268.879346581" Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.070440 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerStarted","Data":"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.084889 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerStarted","Data":"178382a3de582f32c10d899adfcff30626331736ab73b2469c8ca1b37fab0c4c"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.094922 4699 generic.go:334] "Generic (PLEG): container finished" podID="21ee9717-aaae-4511-9cee-fb022818e57d" containerID="92cf2b1cba562648cb5236aef5b4582d6ded613391d9217a2ee3e5335a2f73cf" exitCode=0 Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.095003 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" event={"ID":"21ee9717-aaae-4511-9cee-fb022818e57d","Type":"ContainerDied","Data":"92cf2b1cba562648cb5236aef5b4582d6ded613391d9217a2ee3e5335a2f73cf"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.099812 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f49xd" 
event={"ID":"8426fd89-9eba-46fa-8611-e98cc7636b41","Type":"ContainerStarted","Data":"2cec29afd9941e14f3e1571b5331427d3b1faa6723571c88143afc902d980bd2"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.109435 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.109409377 podStartE2EDuration="12.109409377s" podCreationTimestamp="2026-02-26 11:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:03.108970425 +0000 UTC m=+1268.919796859" watchObservedRunningTime="2026-02-26 11:32:03.109409377 +0000 UTC m=+1268.920235811" Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.115338 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerStarted","Data":"c3cf20e496184d423dd9676570affb3ed62ff3f5e0e800069d47d590effab24c"} Feb 26 11:32:03 crc kubenswrapper[4699]: I0226 11:32:03.134994 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f49xd" podStartSLOduration=5.612793488 podStartE2EDuration="47.134970966s" podCreationTimestamp="2026-02-26 11:31:16 +0000 UTC" firstStartedPulling="2026-02-26 11:31:17.774246916 +0000 UTC m=+1223.585073350" lastFinishedPulling="2026-02-26 11:31:59.296424394 +0000 UTC m=+1265.107250828" observedRunningTime="2026-02-26 11:32:03.129833697 +0000 UTC m=+1268.940660131" watchObservedRunningTime="2026-02-26 11:32:03.134970966 +0000 UTC m=+1268.945797420" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.131766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerStarted","Data":"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2"} Feb 26 11:32:04 crc 
kubenswrapper[4699]: I0226 11:32:04.136684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerStarted","Data":"5fee23b2bd35e07b2bc23127d9ba51147df6b5de9840523d49b7247f51fcf676"} Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.138256 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.138316 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.144924 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerStarted","Data":"1a317729338c17b4684891909a81baf465ca5e0314fc7b75c6f3742a26c946fe"} Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.144988 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.145163 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.155996 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.155978851 podStartE2EDuration="8.155978851s" podCreationTimestamp="2026-02-26 11:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:04.155056014 +0000 UTC m=+1269.965882468" watchObservedRunningTime="2026-02-26 11:32:04.155978851 +0000 UTC m=+1269.966805305" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.184788 4699 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/barbican-api-84b6bf6c74-r47qt" podStartSLOduration=8.184762962 podStartE2EDuration="8.184762962s" podCreationTimestamp="2026-02-26 11:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:04.179453269 +0000 UTC m=+1269.990279723" watchObservedRunningTime="2026-02-26 11:32:04.184762962 +0000 UTC m=+1269.995589406" Feb 26 11:32:04 crc kubenswrapper[4699]: I0226 11:32:04.212601 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78f86c6bf8-r6wpf" podStartSLOduration=4.212577836 podStartE2EDuration="4.212577836s" podCreationTimestamp="2026-02-26 11:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:04.206057487 +0000 UTC m=+1270.016883921" watchObservedRunningTime="2026-02-26 11:32:04.212577836 +0000 UTC m=+1270.023404270" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.170662 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerStarted","Data":"47aa6fcab7ba63e0059bde039291f7d09fed39c47d8ed3b4b011f2b39240d68f"} Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.180297 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" event={"ID":"343bb829-035d-4834-a0c4-d9a61c11a2ee","Type":"ContainerStarted","Data":"f2cdecc6eba8599d08f98abb877e3708c955cb03d406931c6fd1ea5f2ab28e98"} Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.191438 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerStarted","Data":"514b33745b4aa127708bf8765bb8617e15516309231f6f17906729d04d3d2a16"} 
Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.195163 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4878dd78-qpvzg" event={"ID":"b7700bd0-21d8-4b96-9753-2619443038a3","Type":"ContainerStarted","Data":"b52272c268905f2ef6a8812534bcc8bf8110d18f696daedaf2e094ce064ec7f6"} Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.196050 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.196074 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d4878dd78-qpvzg" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.206913 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" podStartSLOduration=2.250786603 podStartE2EDuration="5.20688932s" podCreationTimestamp="2026-02-26 11:32:00 +0000 UTC" firstStartedPulling="2026-02-26 11:32:01.553467765 +0000 UTC m=+1267.364294199" lastFinishedPulling="2026-02-26 11:32:04.509570482 +0000 UTC m=+1270.320396916" observedRunningTime="2026-02-26 11:32:05.197410456 +0000 UTC m=+1271.008236890" watchObservedRunningTime="2026-02-26 11:32:05.20688932 +0000 UTC m=+1271.017715754" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.220421 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" event={"ID":"21ee9717-aaae-4511-9cee-fb022818e57d","Type":"ContainerStarted","Data":"306402b7645a267592b660f978f8685767bc49fa883947fdba6ed6fa1d54d19c"} Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.221236 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.234688 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d4878dd78-qpvzg" 
podStartSLOduration=5.2346687020000005 podStartE2EDuration="5.234668702s" podCreationTimestamp="2026-02-26 11:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:05.227239018 +0000 UTC m=+1271.038065462" watchObservedRunningTime="2026-02-26 11:32:05.234668702 +0000 UTC m=+1271.045495136" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.240098 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerStarted","Data":"a0d7c518107ce530bde8dc06ecc1543caeb752ad958b36e173be8e60f8d8a088"} Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.240149 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.241008 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.256219 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" podStartSLOduration=11.256200584 podStartE2EDuration="11.256200584s" podCreationTimestamp="2026-02-26 11:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:05.250422837 +0000 UTC m=+1271.061249271" watchObservedRunningTime="2026-02-26 11:32:05.256200584 +0000 UTC m=+1271.067027018" Feb 26 11:32:05 crc kubenswrapper[4699]: I0226 11:32:05.276151 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c455f6f5b-f25td" podStartSLOduration=11.27612967 podStartE2EDuration="11.27612967s" podCreationTimestamp="2026-02-26 11:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:05.27164094 +0000 UTC m=+1271.082467374" watchObservedRunningTime="2026-02-26 11:32:05.27612967 +0000 UTC m=+1271.086956104" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.252055 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerStarted","Data":"33718273cf0b85bce01d52282cadd465a6d877e40d664fb877a0e2590e81381a"} Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.254840 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerStarted","Data":"59e86688f7ad25464b86f65ac7156f4d78dbfbb25e41fcbc1ec58a4c8ed79739"} Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.257723 4699 generic.go:334] "Generic (PLEG): container finished" podID="343bb829-035d-4834-a0c4-d9a61c11a2ee" containerID="f2cdecc6eba8599d08f98abb877e3708c955cb03d406931c6fd1ea5f2ab28e98" exitCode=0 Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.257807 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" event={"ID":"343bb829-035d-4834-a0c4-d9a61c11a2ee","Type":"ContainerDied","Data":"f2cdecc6eba8599d08f98abb877e3708c955cb03d406931c6fd1ea5f2ab28e98"} Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.280806 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" podStartSLOduration=8.325056648 podStartE2EDuration="13.280785073s" podCreationTimestamp="2026-02-26 11:31:53 +0000 UTC" firstStartedPulling="2026-02-26 11:31:59.497661155 +0000 UTC m=+1265.308487589" lastFinishedPulling="2026-02-26 11:32:04.45338958 +0000 UTC m=+1270.264216014" observedRunningTime="2026-02-26 11:32:06.272194775 +0000 UTC m=+1272.083021229" 
watchObservedRunningTime="2026-02-26 11:32:06.280785073 +0000 UTC m=+1272.091611517" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.319673 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" podStartSLOduration=9.021380526 podStartE2EDuration="13.319650694s" podCreationTimestamp="2026-02-26 11:31:53 +0000 UTC" firstStartedPulling="2026-02-26 11:32:00.178350283 +0000 UTC m=+1265.989176727" lastFinishedPulling="2026-02-26 11:32:04.476620461 +0000 UTC m=+1270.287446895" observedRunningTime="2026-02-26 11:32:06.30566609 +0000 UTC m=+1272.116492524" watchObservedRunningTime="2026-02-26 11:32:06.319650694 +0000 UTC m=+1272.130477158" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.420879 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6596b66679-qmv4f"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.423488 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.473207 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6596b66679-qmv4f"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.510151 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5bb8c656f4-cl8tt"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.511771 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.536455 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bb8c656f4-cl8tt"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.556946 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb59470-4038-48c2-a3ec-f3046406a971-logs\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.557004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data-custom\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.557077 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.557157 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-combined-ca-bundle\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.557285 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bh4\" (UniqueName: \"kubernetes.io/projected/edb59470-4038-48c2-a3ec-f3046406a971-kube-api-access-r5bh4\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.590618 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.621940 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-977f89944-b96zk"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.625505 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.646067 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-977f89944-b96zk"] Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659243 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bh4\" (UniqueName: \"kubernetes.io/projected/edb59470-4038-48c2-a3ec-f3046406a971-kube-api-access-r5bh4\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659322 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659351 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb59470-4038-48c2-a3ec-f3046406a971-logs\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659369 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data-custom\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659413 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data-custom\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659435 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659451 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770f4ffe-352c-416b-8f67-a894c4107003-logs\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659472 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-combined-ca-bundle\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwsgj\" (UniqueName: \"kubernetes.io/projected/770f4ffe-352c-416b-8f67-a894c4107003-kube-api-access-zwsgj\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.659534 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-combined-ca-bundle\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.661963 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb59470-4038-48c2-a3ec-f3046406a971-logs\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.676614 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data-custom\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" 
Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.677238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-config-data\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.677849 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb59470-4038-48c2-a3ec-f3046406a971-combined-ca-bundle\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.695815 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bh4\" (UniqueName: \"kubernetes.io/projected/edb59470-4038-48c2-a3ec-f3046406a971-kube-api-access-r5bh4\") pod \"barbican-worker-6596b66679-qmv4f\" (UID: \"edb59470-4038-48c2-a3ec-f3046406a971\") " pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.761445 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6596b66679-qmv4f" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.762874 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spmxc\" (UniqueName: \"kubernetes.io/projected/dd004e01-9dac-4316-b6ee-05c1a0f20713-kube-api-access-spmxc\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.762957 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-combined-ca-bundle\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.762996 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763019 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763057 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-internal-tls-certs\") 
pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763293 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data-custom\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763358 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770f4ffe-352c-416b-8f67-a894c4107003-logs\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763390 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data-custom\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-public-tls-certs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-combined-ca-bundle\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763545 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwsgj\" (UniqueName: \"kubernetes.io/projected/770f4ffe-352c-416b-8f67-a894c4107003-kube-api-access-zwsgj\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.763851 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd004e01-9dac-4316-b6ee-05c1a0f20713-logs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.764454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770f4ffe-352c-416b-8f67-a894c4107003-logs\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.773742 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.776673 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-config-data-custom\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.781416 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwsgj\" (UniqueName: \"kubernetes.io/projected/770f4ffe-352c-416b-8f67-a894c4107003-kube-api-access-zwsgj\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.784822 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770f4ffe-352c-416b-8f67-a894c4107003-combined-ca-bundle\") pod \"barbican-keystone-listener-5bb8c656f4-cl8tt\" (UID: \"770f4ffe-352c-416b-8f67-a894c4107003\") " pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.835605 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865674 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd004e01-9dac-4316-b6ee-05c1a0f20713-logs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865735 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spmxc\" (UniqueName: \"kubernetes.io/projected/dd004e01-9dac-4316-b6ee-05c1a0f20713-kube-api-access-spmxc\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865787 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-combined-ca-bundle\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865833 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865868 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-internal-tls-certs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " 
pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865927 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data-custom\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.865948 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-public-tls-certs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.866472 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd004e01-9dac-4316-b6ee-05c1a0f20713-logs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.871352 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-combined-ca-bundle\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.874560 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-public-tls-certs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: 
I0226 11:32:06.877096 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data-custom\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.877873 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-internal-tls-certs\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.883232 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd004e01-9dac-4316-b6ee-05c1a0f20713-config-data\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.894829 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spmxc\" (UniqueName: \"kubernetes.io/projected/dd004e01-9dac-4316-b6ee-05c1a0f20713-kube-api-access-spmxc\") pod \"barbican-api-977f89944-b96zk\" (UID: \"dd004e01-9dac-4316-b6ee-05c1a0f20713\") " pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:06 crc kubenswrapper[4699]: I0226 11:32:06.951541 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-977f89944-b96zk" Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.241628 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.242463 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.291514 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.307685 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:07 crc kubenswrapper[4699]: W0226 11:32:07.401196 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedb59470_4038_48c2_a3ec_f3046406a971.slice/crio-f9e9ed4f68ff5e841b905f88457d8ee5e1235b06031621cd281168141ba74c94 WatchSource:0}: Error finding container f9e9ed4f68ff5e841b905f88457d8ee5e1235b06031621cd281168141ba74c94: Status 404 returned error can't find the container with id f9e9ed4f68ff5e841b905f88457d8ee5e1235b06031621cd281168141ba74c94 Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.406550 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6596b66679-qmv4f"] Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.486840 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bb8c656f4-cl8tt"] Feb 26 11:32:07 crc kubenswrapper[4699]: W0226 11:32:07.502324 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod770f4ffe_352c_416b_8f67_a894c4107003.slice/crio-da0df9922567327a6a465c5335d645dd901f4e99046de86caad0e48c0c0c9aa5 WatchSource:0}: Error finding container da0df9922567327a6a465c5335d645dd901f4e99046de86caad0e48c0c0c9aa5: Status 404 returned error can't find the container with id da0df9922567327a6a465c5335d645dd901f4e99046de86caad0e48c0c0c9aa5 Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.592240 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-977f89944-b96zk"] Feb 26 11:32:07 crc kubenswrapper[4699]: W0226 11:32:07.643698 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd004e01_9dac_4316_b6ee_05c1a0f20713.slice/crio-3140f5b1ce1501b49833062e2d5d01a117ba9b880f4453bd067f76646894c128 WatchSource:0}: Error finding container 3140f5b1ce1501b49833062e2d5d01a117ba9b880f4453bd067f76646894c128: Status 404 returned error can't find the container with id 3140f5b1ce1501b49833062e2d5d01a117ba9b880f4453bd067f76646894c128 Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.802935 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.901627 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpjjs\" (UniqueName: \"kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs\") pod \"343bb829-035d-4834-a0c4-d9a61c11a2ee\" (UID: \"343bb829-035d-4834-a0c4-d9a61c11a2ee\") " Feb 26 11:32:07 crc kubenswrapper[4699]: I0226 11:32:07.924690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs" (OuterVolumeSpecName: "kube-api-access-hpjjs") pod "343bb829-035d-4834-a0c4-d9a61c11a2ee" (UID: "343bb829-035d-4834-a0c4-d9a61c11a2ee"). InnerVolumeSpecName "kube-api-access-hpjjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.004388 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpjjs\" (UniqueName: \"kubernetes.io/projected/343bb829-035d-4834-a0c4-d9a61c11a2ee-kube-api-access-hpjjs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.137462 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.214023 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5795557cd8-dvzqq" podUID="15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused" Feb 26 11:32:08 crc 
kubenswrapper[4699]: I0226 11:32:08.286519 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" event={"ID":"770f4ffe-352c-416b-8f67-a894c4107003","Type":"ContainerStarted","Data":"8922e19c5aab9ec1b1802ee35e765681c8e48a651add12222ada15ceab724d5d"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.286572 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" event={"ID":"770f4ffe-352c-416b-8f67-a894c4107003","Type":"ContainerStarted","Data":"da0df9922567327a6a465c5335d645dd901f4e99046de86caad0e48c0c0c9aa5"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.289917 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" event={"ID":"343bb829-035d-4834-a0c4-d9a61c11a2ee","Type":"ContainerDied","Data":"79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.289977 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79a1129abbd02611508e0ac75a09840d716301f132a8b45d0ccf4b2b830b608a" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.290066 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535092-t7q4h" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.298188 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-977f89944-b96zk" event={"ID":"dd004e01-9dac-4316-b6ee-05c1a0f20713","Type":"ContainerStarted","Data":"6c3e3ad06d91af3563256917e31187ffdce0e6ac43b19116b69ce425875fd7a8"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.298235 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-977f89944-b96zk" event={"ID":"dd004e01-9dac-4316-b6ee-05c1a0f20713","Type":"ContainerStarted","Data":"3140f5b1ce1501b49833062e2d5d01a117ba9b880f4453bd067f76646894c128"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.299421 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535086-jjp9j"] Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.300847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6596b66679-qmv4f" event={"ID":"edb59470-4038-48c2-a3ec-f3046406a971","Type":"ContainerStarted","Data":"870168c7b1196015e731aed77dea2764205201ef5876695bad0a57f0beae9fd1"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.300913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6596b66679-qmv4f" event={"ID":"edb59470-4038-48c2-a3ec-f3046406a971","Type":"ContainerStarted","Data":"f9e9ed4f68ff5e841b905f88457d8ee5e1235b06031621cd281168141ba74c94"} Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.300948 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" containerID="cri-o://68ffeebcc8219b513faf07851f7ec0e29081e29acdefcbc0d3a8bcb52016ff06" gracePeriod=30 Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.301076 4699 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" containerID="cri-o://a0d7c518107ce530bde8dc06ecc1543caeb752ad958b36e173be8e60f8d8a088" gracePeriod=30 Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.301683 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.301913 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.306792 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": EOF" Feb 26 11:32:08 crc kubenswrapper[4699]: I0226 11:32:08.323301 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535086-jjp9j"] Feb 26 11:32:09 crc kubenswrapper[4699]: I0226 11:32:09.310639 4699 generic.go:334] "Generic (PLEG): container finished" podID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerID="68ffeebcc8219b513faf07851f7ec0e29081e29acdefcbc0d3a8bcb52016ff06" exitCode=143 Feb 26 11:32:09 crc kubenswrapper[4699]: I0226 11:32:09.310739 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerDied","Data":"68ffeebcc8219b513faf07851f7ec0e29081e29acdefcbc0d3a8bcb52016ff06"} Feb 26 11:32:09 crc kubenswrapper[4699]: I0226 11:32:09.554701 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" Feb 26 11:32:09 crc kubenswrapper[4699]: I0226 11:32:09.627430 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"]
Feb 26 11:32:09 crc kubenswrapper[4699]: I0226 11:32:09.627728 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" containerID="cri-o://3eda8514ede18fd03dc0849cf95cf8d9b4cb3f130429078ff465a976e2f5421b" gracePeriod=10
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.273627 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce3efa9-6f6f-4e81-a7a4-6249237a0d61" path="/var/lib/kubelet/pods/fce3efa9-6f6f-4e81-a7a4-6249237a0d61/volumes"
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.323422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" event={"ID":"770f4ffe-352c-416b-8f67-a894c4107003","Type":"ContainerStarted","Data":"a7080c86547fd8f7c7e97ac6d4432f041d526e65a75fc69008d1132326998b56"}
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.331527 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.331553 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.331917 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-977f89944-b96zk" event={"ID":"dd004e01-9dac-4316-b6ee-05c1a0f20713","Type":"ContainerStarted","Data":"14f772681a296c450847b03d8e34f52d0bcba29f69abe121cf72db917752342c"}
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.442211 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84b6bf6c74-r47qt" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 11:32:10 crc kubenswrapper[4699]: I0226 11:32:10.511227 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d4878dd78-qpvzg"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.048933 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.332170 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.387508 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6596b66679-qmv4f" event={"ID":"edb59470-4038-48c2-a3ec-f3046406a971","Type":"ContainerStarted","Data":"22902c6f0d47bcb2e5584cc068bb540826236ee2bb20b5249dd39ec46f56f698"}
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.401434 4699 generic.go:334] "Generic (PLEG): container finished" podID="81843e2c-774f-402a-bd90-c4485ab24c05" containerID="3eda8514ede18fd03dc0849cf95cf8d9b4cb3f130429078ff465a976e2f5421b" exitCode=0
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.402633 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" event={"ID":"81843e2c-774f-402a-bd90-c4485ab24c05","Type":"ContainerDied","Data":"3eda8514ede18fd03dc0849cf95cf8d9b4cb3f130429078ff465a976e2f5421b"}
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.402687 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-977f89944-b96zk"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.402721 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-977f89944-b96zk"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.414643 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6596b66679-qmv4f" podStartSLOduration=5.41462396 podStartE2EDuration="5.41462396s" podCreationTimestamp="2026-02-26 11:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:11.40598046 +0000 UTC m=+1277.216806904" watchObservedRunningTime="2026-02-26 11:32:11.41462396 +0000 UTC m=+1277.225450394"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.437913 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-977f89944-b96zk" podStartSLOduration=5.437893152 podStartE2EDuration="5.437893152s" podCreationTimestamp="2026-02-26 11:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:11.430317683 +0000 UTC m=+1277.241144117" watchObservedRunningTime="2026-02-26 11:32:11.437893152 +0000 UTC m=+1277.248719586"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.463848 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"]
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.464100 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker-log" containerID="cri-o://47aa6fcab7ba63e0059bde039291f7d09fed39c47d8ed3b4b011f2b39240d68f" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.464166 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker" containerID="cri-o://59e86688f7ad25464b86f65ac7156f4d78dbfbb25e41fcbc1ec58a4c8ed79739" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.474045 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5bb8c656f4-cl8tt" podStartSLOduration=5.474025885 podStartE2EDuration="5.474025885s" podCreationTimestamp="2026-02-26 11:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:11.459237848 +0000 UTC m=+1277.270064302" watchObservedRunningTime="2026-02-26 11:32:11.474025885 +0000 UTC m=+1277.284852319"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.508437 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"]
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.508662 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener-log" containerID="cri-o://514b33745b4aa127708bf8765bb8617e15516309231f6f17906729d04d3d2a16" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.508831 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener" containerID="cri-o://33718273cf0b85bce01d52282cadd465a6d877e40d664fb877a0e2590e81381a" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.729690 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d4878dd78-qpvzg"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.825305 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"]
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.825566 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78f86c6bf8-r6wpf" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-log" containerID="cri-o://cfb24c179a421c25f4518e3a61ebbebf3cbb893957f33df4ff29a903e7099944" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.826359 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78f86c6bf8-r6wpf" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-api" containerID="cri-o://1a317729338c17b4684891909a81baf465ca5e0314fc7b75c6f3742a26c946fe" gracePeriod=30
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.875658 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-78f86c6bf8-r6wpf" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": read tcp 10.217.0.2:51950->10.217.0.168:8778: read: connection reset by peer"
Feb 26 11:32:11 crc kubenswrapper[4699]: I0226 11:32:11.875996 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-78f86c6bf8-r6wpf" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": EOF"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.202529 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.202577 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.329443 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.343591 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.429972 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.436722 4699 generic.go:334] "Generic (PLEG): container finished" podID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerID="514b33745b4aa127708bf8765bb8617e15516309231f6f17906729d04d3d2a16" exitCode=143
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.436805 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerDied","Data":"514b33745b4aa127708bf8765bb8617e15516309231f6f17906729d04d3d2a16"}
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.454433 4699 generic.go:334] "Generic (PLEG): container finished" podID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerID="cfb24c179a421c25f4518e3a61ebbebf3cbb893957f33df4ff29a903e7099944" exitCode=143
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.454566 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerDied","Data":"cfb24c179a421c25f4518e3a61ebbebf3cbb893957f33df4ff29a903e7099944"}
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.461455 4699 generic.go:334] "Generic (PLEG): container finished" podID="8426fd89-9eba-46fa-8611-e98cc7636b41" containerID="2cec29afd9941e14f3e1571b5331427d3b1faa6723571c88143afc902d980bd2" exitCode=0
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.461550 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f49xd" event={"ID":"8426fd89-9eba-46fa-8611-e98cc7636b41","Type":"ContainerDied","Data":"2cec29afd9941e14f3e1571b5331427d3b1faa6723571c88143afc902d980bd2"}
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.468860 4699 generic.go:334] "Generic (PLEG): container finished" podID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerID="59e86688f7ad25464b86f65ac7156f4d78dbfbb25e41fcbc1ec58a4c8ed79739" exitCode=0
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.468894 4699 generic.go:334] "Generic (PLEG): container finished" podID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerID="47aa6fcab7ba63e0059bde039291f7d09fed39c47d8ed3b4b011f2b39240d68f" exitCode=143
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.469714 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerDied","Data":"59e86688f7ad25464b86f65ac7156f4d78dbfbb25e41fcbc1ec58a4c8ed79739"}
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.469746 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerDied","Data":"47aa6fcab7ba63e0059bde039291f7d09fed39c47d8ed3b4b011f2b39240d68f"}
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.471809 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:12 crc kubenswrapper[4699]: I0226 11:32:12.471848 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:13 crc kubenswrapper[4699]: I0226 11:32:13.359262 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:32:13 crc kubenswrapper[4699]: I0226 11:32:13.722390 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:58256->10.217.0.162:9311: read: connection reset by peer"
Feb 26 11:32:13 crc kubenswrapper[4699]: I0226 11:32:13.722448 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:58258->10.217.0.162:9311: read: connection reset by peer"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.328775 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b6bf6c74-r47qt"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.495311 4699 generic.go:334] "Generic (PLEG): container finished" podID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerID="a0d7c518107ce530bde8dc06ecc1543caeb752ad958b36e173be8e60f8d8a088" exitCode=0
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.495464 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.495476 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.496698 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerDied","Data":"a0d7c518107ce530bde8dc06ecc1543caeb752ad958b36e173be8e60f8d8a088"}
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.550640 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.550809 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 11:32:14 crc kubenswrapper[4699]: I0226 11:32:14.865476 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b6bf6c74-r47qt"
Feb 26 11:32:15 crc kubenswrapper[4699]: I0226 11:32:15.068798 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:15 crc kubenswrapper[4699]: I0226 11:32:15.071053 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 11:32:15 crc kubenswrapper[4699]: I0226 11:32:15.661943 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-977f89944-b96zk"
Feb 26 11:32:16 crc kubenswrapper[4699]: I0226 11:32:16.526791 4699 generic.go:334] "Generic (PLEG): container finished" podID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerID="33718273cf0b85bce01d52282cadd465a6d877e40d664fb877a0e2590e81381a" exitCode=0
Feb 26 11:32:16 crc kubenswrapper[4699]: I0226 11:32:16.527075 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerDied","Data":"33718273cf0b85bce01d52282cadd465a6d877e40d664fb877a0e2590e81381a"}
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.423206 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.503342 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-977f89944-b96zk"
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.542853 4699 generic.go:334] "Generic (PLEG): container finished" podID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerID="1a317729338c17b4684891909a81baf465ca5e0314fc7b75c6f3742a26c946fe" exitCode=0
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.543027 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerDied","Data":"1a317729338c17b4684891909a81baf465ca5e0314fc7b75c6f3742a26c946fe"}
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.569756 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"]
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.569991 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b6bf6c74-r47qt" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api-log" containerID="cri-o://c3cf20e496184d423dd9676570affb3ed62ff3f5e0e800069d47d590effab24c" gracePeriod=30
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.570521 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b6bf6c74-r47qt" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api" containerID="cri-o://5fee23b2bd35e07b2bc23127d9ba51147df6b5de9840523d49b7247f51fcf676" gracePeriod=30
Feb 26 11:32:17 crc kubenswrapper[4699]: I0226 11:32:17.699246 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59dd795c56-7kv72"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.107367 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"]
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.107654 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6dc5565bbf-zgvcg" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-api" containerID="cri-o://f1b43b05d45b05ac3c54d378fa118972d9e5848b345eada8b66bb2c67ea89c63" gracePeriod=30
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.107776 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6dc5565bbf-zgvcg" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" containerID="cri-o://fb5409015c0850abe735cc049f283c49118298bd94a368b4191042b9fb38469f" gracePeriod=30
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.122174 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6dc5565bbf-zgvcg" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.138184 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d45896d49-mh862"]
Feb 26 11:32:18 crc kubenswrapper[4699]: E0226 11:32:18.138582 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343bb829-035d-4834-a0c4-d9a61c11a2ee" containerName="oc"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.138597 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="343bb829-035d-4834-a0c4-d9a61c11a2ee" containerName="oc"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.138782 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="343bb829-035d-4834-a0c4-d9a61c11a2ee" containerName="oc"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.139786 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.151520 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d45896d49-mh862"]
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.226621 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-internal-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.226684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.226738 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-combined-ca-bundle\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.226764 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-ovndb-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.227067 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-public-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.227132 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-httpd-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.227405 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xpnq\" (UniqueName: \"kubernetes.io/projected/862cb546-78f8-4864-a158-9dc217ec2796-kube-api-access-7xpnq\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329003 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-public-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329078 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-httpd-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329228 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xpnq\" (UniqueName: \"kubernetes.io/projected/862cb546-78f8-4864-a158-9dc217ec2796-kube-api-access-7xpnq\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329327 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-internal-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329375 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329411 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-combined-ca-bundle\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.329471 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-ovndb-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.335145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-public-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.335146 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-ovndb-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.335773 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-combined-ca-bundle\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.336374 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.336581 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-internal-tls-certs\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.344151 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/862cb546-78f8-4864-a158-9dc217ec2796-httpd-config\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.349491 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xpnq\" (UniqueName: \"kubernetes.io/projected/862cb546-78f8-4864-a158-9dc217ec2796-kube-api-access-7xpnq\") pod \"neutron-6d45896d49-mh862\" (UID: \"862cb546-78f8-4864-a158-9dc217ec2796\") " pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.462630 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d45896d49-mh862"
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.554390 4699 generic.go:334] "Generic (PLEG): container finished" podID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerID="c3cf20e496184d423dd9676570affb3ed62ff3f5e0e800069d47d590effab24c" exitCode=143
Feb 26 11:32:18 crc kubenswrapper[4699]: I0226 11:32:18.554675 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerDied","Data":"c3cf20e496184d423dd9676570affb3ed62ff3f5e0e800069d47d590effab24c"}
Feb 26 11:32:19 crc kubenswrapper[4699]: I0226 11:32:19.551489 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 11:32:19 crc kubenswrapper[4699]: I0226 11:32:19.552381 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 11:32:20 crc kubenswrapper[4699]: I0226 11:32:20.008885 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6dc5565bbf-zgvcg" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused"
Feb 26 11:32:20 crc kubenswrapper[4699]: I0226 11:32:20.322608 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57899c756d-w9pc5"
Feb 26 11:32:20 crc kubenswrapper[4699]: I0226 11:32:20.335734 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5795557cd8-dvzqq"
Feb 26 11:32:21 crc kubenswrapper[4699]: I0226 11:32:21.451346 4699 scope.go:117] "RemoveContainer" containerID="842f6cf352666ae13feda0b772e0ee74a200121a74a35bd2b4b96deac77bd6aa"
Feb 26 11:32:21 crc kubenswrapper[4699]: I0226 11:32:21.584026 4699 generic.go:334] "Generic (PLEG): container finished" podID="73fd43db-ab24-441d-9912-881ef04d4572" containerID="fb5409015c0850abe735cc049f283c49118298bd94a368b4191042b9fb38469f" exitCode=0
Feb 26 11:32:21 crc kubenswrapper[4699]: I0226 11:32:21.584099 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerDied","Data":"fb5409015c0850abe735cc049f283c49118298bd94a368b4191042b9fb38469f"}
Feb 26 11:32:21 crc kubenswrapper[4699]: I0226 11:32:21.585828 4699 generic.go:334] "Generic (PLEG): container finished" podID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerID="5fee23b2bd35e07b2bc23127d9ba51147df6b5de9840523d49b7247f51fcf676" exitCode=0
Feb 26 11:32:21 crc kubenswrapper[4699]: I0226 11:32:21.585857 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerDied","Data":"5fee23b2bd35e07b2bc23127d9ba51147df6b5de9840523d49b7247f51fcf676"}
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.096080 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5795557cd8-dvzqq"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.164294 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-57899c756d-w9pc5"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.172754 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57899c756d-w9pc5"]
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.348739 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84b6bf6c74-r47qt" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.348817 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84b6bf6c74-r47qt" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.423167 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.423309 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh"
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.593323 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon-log" containerID="cri-o://de9a25314ef41f7d3414b57dcaeec2a9add4d5ecb708b80dc9af27c79856ba9b" gracePeriod=30
Feb 26 11:32:22 crc kubenswrapper[4699]: I0226 11:32:22.593372 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon" containerID="cri-o://5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d" gracePeriod=30
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.803666 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f49xd"
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931432 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931614 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931642 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931678 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.931777 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr9sd\" (UniqueName: \"kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd\") pod \"8426fd89-9eba-46fa-8611-e98cc7636b41\" (UID: \"8426fd89-9eba-46fa-8611-e98cc7636b41\") "
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.932617 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.937784 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts" (OuterVolumeSpecName: "scripts") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.937944 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd" (OuterVolumeSpecName: "kube-api-access-mr9sd") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "kube-api-access-mr9sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.955391 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.961543 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:23 crc kubenswrapper[4699]: I0226 11:32:23.979732 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data" (OuterVolumeSpecName: "config-data") pod "8426fd89-9eba-46fa-8611-e98cc7636b41" (UID: "8426fd89-9eba-46fa-8611-e98cc7636b41"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034079 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034185 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr9sd\" (UniqueName: \"kubernetes.io/projected/8426fd89-9eba-46fa-8611-e98cc7636b41-kube-api-access-mr9sd\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034201 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034210 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034220 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8426fd89-9eba-46fa-8611-e98cc7636b41-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.034228 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8426fd89-9eba-46fa-8611-e98cc7636b41-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.550845 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 
10.217.0.162:9311: connect: connection refused" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.551009 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c455f6f5b-f25td" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.611963 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f49xd" event={"ID":"8426fd89-9eba-46fa-8611-e98cc7636b41","Type":"ContainerDied","Data":"3e0a4f4a5840bf076a02406c3b220ed5f7a7941a35ea7875a55be88dc0efa11e"} Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.612287 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e0a4f4a5840bf076a02406c3b220ed5f7a7941a35ea7875a55be88dc0efa11e" Feb 26 11:32:24 crc kubenswrapper[4699]: I0226 11:32:24.612013 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f49xd" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.100365 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:25 crc kubenswrapper[4699]: E0226 11:32:25.104603 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41" containerName="cinder-db-sync" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.104638 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41" containerName="cinder-db-sync" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.104897 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41" containerName="cinder-db-sync" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.106278 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.114133 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.114458 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.114470 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.114642 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bgvh2" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.114799 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.176997 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.177503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9rw5\" (UniqueName: \"kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.177690 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.177813 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.177926 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.178079 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.199746 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.201647 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.228036 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.282685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.283690 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9rw5\" (UniqueName: \"kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.283804 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.283889 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.283995 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x97ms\" (UniqueName: \"kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284159 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284250 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284327 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284409 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id\") 
pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284688 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.284776 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.297226 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.310855 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 
11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.319959 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.320188 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.332104 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9rw5\" (UniqueName: \"kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5\") pod \"cinder-scheduler-0\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.391732 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.391811 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.391852 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-x97ms\" (UniqueName: \"kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.392057 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.392187 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.392271 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.394418 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.394465 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.395047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.395143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.397489 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.419225 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97ms\" (UniqueName: \"kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms\") pod \"dnsmasq-dns-6bb4fc677f-vlzrl\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.425326 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.430896 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.436479 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.438853 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.446713 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495175 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495534 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwqp\" (UniqueName: \"kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495582 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495623 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data\") pod \"cinder-api-0\" (UID: 
\"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495681 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495869 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.495939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.536728 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605539 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605594 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwqp\" (UniqueName: \"kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605619 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605649 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605681 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605758 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.605802 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.613171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.613301 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.617431 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.649978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 
11:32:25.658263 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.658382 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwqp\" (UniqueName: \"kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.658703 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts\") pod \"cinder-api-0\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " pod="openstack/cinder-api-0" Feb 26 11:32:25 crc kubenswrapper[4699]: I0226 11:32:25.822505 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:26 crc kubenswrapper[4699]: E0226 11:32:26.054797 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78d85906_b78a_46eb_b5dd_4da95c1222d8.slice/crio-5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d.scope\": RecentStats: unable to find data in memory cache]" Feb 26 11:32:26 crc kubenswrapper[4699]: I0226 11:32:26.644236 4699 generic.go:334] "Generic (PLEG): container finished" podID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerID="5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d" exitCode=0 Feb 26 11:32:26 crc kubenswrapper[4699]: I0226 11:32:26.644291 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerDied","Data":"5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d"} Feb 26 11:32:27 crc kubenswrapper[4699]: E0226 11:32:27.022015 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Feb 26 11:32:27 crc kubenswrapper[4699]: E0226 11:32:27.022398 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srl4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7cec2d73-9ca8-4a8b-836d-efce961fbde8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 11:32:27 crc kubenswrapper[4699]: E0226 11:32:27.023662 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.087307 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.090474 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149092 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149440 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149485 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149509 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149547 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149572 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2rj6\" 
(UniqueName: \"kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149589 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149627 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149656 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149677 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149734 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc\") pod \"81843e2c-774f-402a-bd90-c4485ab24c05\" (UID: \"81843e2c-774f-402a-bd90-c4485ab24c05\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149803 4699 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6sq\" (UniqueName: \"kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.149823 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs\") pod \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\" (UID: \"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.155261 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs" (OuterVolumeSpecName: "logs") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.162330 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts" (OuterVolumeSpecName: "scripts") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.163005 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.163034 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.168830 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.169534 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.183565 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6" (OuterVolumeSpecName: "kube-api-access-z2rj6") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "kube-api-access-z2rj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.183650 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq" (OuterVolumeSpecName: "kube-api-access-wv6sq") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "kube-api-access-wv6sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.243623 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264324 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs\") pod \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264430 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs\") pod \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264492 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom\") pod \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264551 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom\") pod \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264586 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data\") pod \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264653 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data\") pod \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264689 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqdfx\" (UniqueName: \"kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx\") pod \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264748 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle\") pod \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\" (UID: \"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264794 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb7t4\" (UniqueName: \"kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4\") pod \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264798 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs" (OuterVolumeSpecName: "logs") pod "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" (UID: "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.264888 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle\") pod \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\" (UID: \"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.265567 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6sq\" (UniqueName: \"kubernetes.io/projected/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-kube-api-access-wv6sq\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.265617 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/81843e2c-774f-402a-bd90-c4485ab24c05-kube-api-access-z2rj6\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.265631 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.272846 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.273446 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs" (OuterVolumeSpecName: "logs") pod "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" (UID: "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.290700 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.295972 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" (UID: "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.296821 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4" (OuterVolumeSpecName: "kube-api-access-wb7t4") pod "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" (UID: "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f"). InnerVolumeSpecName "kube-api-access-wb7t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.301558 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx" (OuterVolumeSpecName: "kube-api-access-sqdfx") pod "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" (UID: "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4"). InnerVolumeSpecName "kube-api-access-sqdfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.324698 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" (UID: "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.333858 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.366974 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom\") pod \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367044 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367143 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85rxv\" (UniqueName: \"kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " 
Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367191 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzs9k\" (UniqueName: \"kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k\") pod \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367244 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367317 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367355 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data\") pod \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367450 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367521 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367558 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs\") pod \"0876db8f-e235-40d9-b4a5-718097cdf02c\" (UID: \"0876db8f-e235-40d9-b4a5-718097cdf02c\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367607 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs\") pod \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.367648 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle\") pod \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\" (UID: \"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b\") " Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368289 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqdfx\" (UniqueName: \"kubernetes.io/projected/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-kube-api-access-sqdfx\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368314 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb7t4\" (UniqueName: \"kubernetes.io/projected/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-kube-api-access-wb7t4\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368326 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368337 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368350 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368361 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.368372 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.376925 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs" (OuterVolumeSpecName: "logs") pod "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" (UID: "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.377422 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs" (OuterVolumeSpecName: "logs") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.380948 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.382286 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv" (OuterVolumeSpecName: "kube-api-access-85rxv") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "kube-api-access-85rxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.391605 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" (UID: "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.398240 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.401269 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k" (OuterVolumeSpecName: "kube-api-access-mzs9k") pod "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" (UID: "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b"). InnerVolumeSpecName "kube-api-access-mzs9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.410481 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473659 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473697 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473708 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85rxv\" (UniqueName: \"kubernetes.io/projected/0876db8f-e235-40d9-b4a5-718097cdf02c-kube-api-access-85rxv\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473719 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzs9k\" (UniqueName: 
\"kubernetes.io/projected/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-kube-api-access-mzs9k\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473735 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473746 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473756 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.473766 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0876db8f-e235-40d9-b4a5-718097cdf02c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.480727 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.495521 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d45896d49-mh862"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.525437 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config" (OuterVolumeSpecName: "config") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.534191 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" (UID: "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.581720 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.581758 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.606720 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81843e2c-774f-402a-bd90-c4485ab24c05" (UID: "81843e2c-774f-402a-bd90-c4485ab24c05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.613248 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" (UID: "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.618993 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data" (OuterVolumeSpecName: "config-data") pod "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" (UID: "2147a5bc-be0c-4ab4-a0ee-ede87002a1a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.624995 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.669613 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" event={"ID":"81843e2c-774f-402a-bd90-c4485ab24c05","Type":"ContainerDied","Data":"fed26d1422b55affaace34ac700e5a58aa1d192cab8a88f61c67c7cb3b1ca3ed"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.669667 4699 scope.go:117] "RemoveContainer" containerID="3eda8514ede18fd03dc0849cf95cf8d9b4cb3f130429078ff465a976e2f5421b" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.669805 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nhzhh" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.677368 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.685881 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81843e2c-774f-402a-bd90-c4485ab24c05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.685907 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.685920 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.685932 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.685943 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.695060 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.699952 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c455f6f5b-f25td" event={"ID":"2147a5bc-be0c-4ab4-a0ee-ede87002a1a4","Type":"ContainerDied","Data":"1ea84a2c17c70c4722d76da041934ea3f75af2c65494a5778df946ebb8677371"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.700089 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c455f6f5b-f25td" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.711734 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.711899 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" (UID: "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.712932 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78f86c6bf8-r6wpf" event={"ID":"e9ef11cc-2a83-4f0e-b117-4be10a1c0fee","Type":"ContainerDied","Data":"086804bba8040fba8ead2adc36df764be92ea222ee4962825dd9a4df869adac5"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.713006 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78f86c6bf8-r6wpf" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.717842 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" event={"ID":"4a31ea66-afbf-4606-baa9-0f5fb98e5c4f","Type":"ContainerDied","Data":"67791da9269463758e09bb6a9c9c2f13b834b1a262a1121df8a5fa0b5b6170cf"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.717980 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data" (OuterVolumeSpecName: "config-data") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.719238 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fd9f445b9-bnr2j" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.727709 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d45896d49-mh862" event={"ID":"862cb546-78f8-4864-a158-9dc217ec2796","Type":"ContainerStarted","Data":"29cc7d2eee99e53136d941d8237d18ae89ab2c4497f23c739b9b2ae06d0c1d8c"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.730585 4699 generic.go:334] "Generic (PLEG): container finished" podID="73fd43db-ab24-441d-9912-881ef04d4572" containerID="f1b43b05d45b05ac3c54d378fa118972d9e5848b345eada8b66bb2c67ea89c63" exitCode=0 Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.730763 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerDied","Data":"f1b43b05d45b05ac3c54d378fa118972d9e5848b345eada8b66bb2c67ea89c63"} Feb 26 11:32:27 crc kubenswrapper[4699]: W0226 11:32:27.735266 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod581ae159_48c4_4821_aede_361485304c59.slice/crio-682d7ab4556bf77bd10ad1dbcd5f0a84777dbfdd65cbaf81b868443ac2be23e3 WatchSource:0}: Error finding container 682d7ab4556bf77bd10ad1dbcd5f0a84777dbfdd65cbaf81b868443ac2be23e3: Status 404 returned error can't find the container with id 682d7ab4556bf77bd10ad1dbcd5f0a84777dbfdd65cbaf81b868443ac2be23e3 Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.737259 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.737935 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b6bf6c74-r47qt" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.738015 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b6bf6c74-r47qt" event={"ID":"0876db8f-e235-40d9-b4a5-718097cdf02c","Type":"ContainerDied","Data":"079bbabce73c111db6093e96198997a034c6927d448d649260507e6ce83573d4"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.751415 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data" (OuterVolumeSpecName: "config-data") pod "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" (UID: "b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.760378 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="ceilometer-notification-agent" containerID="cri-o://2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d" gracePeriod=30 Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.760607 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.760704 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="sg-core" containerID="cri-o://5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2" gracePeriod=30 Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.760833 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8dc77f9b6-7s844" event={"ID":"b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b","Type":"ContainerDied","Data":"64d085c2e0471990e9f05ef5274018eb074bf0ab7cec6ddaf7afcafa1dae6331"} Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.767471 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data" (OuterVolumeSpecName: "config-data") pod "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" (UID: "4a31ea66-afbf-4606-baa9-0f5fb98e5c4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.768473 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.776100 4699 scope.go:117] "RemoveContainer" containerID="2161a9d96d5b3712e81eaf624a88f2f6f3ee6fc2f0aaa102d1a1b03d768333c4" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.787280 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.794333 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.810049 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.810201 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.812663 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.830190 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nhzhh"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.843962 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.848511 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.865361 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c455f6f5b-f25td"] Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.871860 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data" (OuterVolumeSpecName: "config-data") pod "0876db8f-e235-40d9-b4a5-718097cdf02c" (UID: "0876db8f-e235-40d9-b4a5-718097cdf02c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.896200 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" (UID: "e9ef11cc-2a83-4f0e-b117-4be10a1c0fee"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.913176 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.913207 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0876db8f-e235-40d9-b4a5-718097cdf02c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.913216 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:27 crc kubenswrapper[4699]: I0226 11:32:27.913226 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.080811 4699 scope.go:117] "RemoveContainer" containerID="a0d7c518107ce530bde8dc06ecc1543caeb752ad958b36e173be8e60f8d8a088" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.131973 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.178216 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.196799 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dc5565bbf-zgvcg" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.203368 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5fd9f445b9-bnr2j"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.235994 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.249238 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84b6bf6c74-r47qt"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.273803 4699 scope.go:117] "RemoveContainer" containerID="68ffeebcc8219b513faf07851f7ec0e29081e29acdefcbc0d3a8bcb52016ff06" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.287732 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" path="/var/lib/kubelet/pods/0876db8f-e235-40d9-b4a5-718097cdf02c/volumes" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.288474 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" path="/var/lib/kubelet/pods/2147a5bc-be0c-4ab4-a0ee-ede87002a1a4/volumes" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.289920 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" path="/var/lib/kubelet/pods/4a31ea66-afbf-4606-baa9-0f5fb98e5c4f/volumes" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.299077 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" path="/var/lib/kubelet/pods/81843e2c-774f-402a-bd90-c4485ab24c05/volumes" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.329796 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config\") pod 
\"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.330545 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.330722 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g2x8\" (UniqueName: \"kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.330930 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.331047 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.331332 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.331491 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs\") pod \"73fd43db-ab24-441d-9912-881ef04d4572\" (UID: \"73fd43db-ab24-441d-9912-881ef04d4572\") " Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.346712 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.346747 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78f86c6bf8-r6wpf"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.346763 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.346774 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8dc77f9b6-7s844"] Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.347170 4699 scope.go:117] "RemoveContainer" containerID="1a317729338c17b4684891909a81baf465ca5e0314fc7b75c6f3742a26c946fe" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.349946 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.372144 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8" (OuterVolumeSpecName: "kube-api-access-6g2x8") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "kube-api-access-6g2x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.449769 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g2x8\" (UniqueName: \"kubernetes.io/projected/73fd43db-ab24-441d-9912-881ef04d4572-kube-api-access-6g2x8\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.449801 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.456288 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config" (OuterVolumeSpecName: "config") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.465793 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.501270 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.509316 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.514736 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "73fd43db-ab24-441d-9912-881ef04d4572" (UID: "73fd43db-ab24-441d-9912-881ef04d4572"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.547549 4699 scope.go:117] "RemoveContainer" containerID="cfb24c179a421c25f4518e3a61ebbebf3cbb893957f33df4ff29a903e7099944" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.551679 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.551709 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.551718 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.551728 
4699 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.551737 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73fd43db-ab24-441d-9912-881ef04d4572-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.621393 4699 scope.go:117] "RemoveContainer" containerID="59e86688f7ad25464b86f65ac7156f4d78dbfbb25e41fcbc1ec58a4c8ed79739" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.658535 4699 scope.go:117] "RemoveContainer" containerID="47aa6fcab7ba63e0059bde039291f7d09fed39c47d8ed3b4b011f2b39240d68f" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.688875 4699 scope.go:117] "RemoveContainer" containerID="5fee23b2bd35e07b2bc23127d9ba51147df6b5de9840523d49b7247f51fcf676" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.744692 4699 scope.go:117] "RemoveContainer" containerID="c3cf20e496184d423dd9676570affb3ed62ff3f5e0e800069d47d590effab24c" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.797842 4699 scope.go:117] "RemoveContainer" containerID="33718273cf0b85bce01d52282cadd465a6d877e40d664fb877a0e2590e81381a" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.806649 4699 generic.go:334] "Generic (PLEG): container finished" podID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerID="5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2" exitCode=2 Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.806718 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerDied","Data":"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.808099 4699 generic.go:334] "Generic 
(PLEG): container finished" podID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerID="c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888" exitCode=0 Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.808169 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" event={"ID":"9fa27ea0-52eb-406f-8256-68b4a471e452","Type":"ContainerDied","Data":"c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.808186 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" event={"ID":"9fa27ea0-52eb-406f-8256-68b4a471e452","Type":"ContainerStarted","Data":"096662e32232c28cf3046778c91211f7c3482d79260670ba5c8b5347692e739f"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.834323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerStarted","Data":"48e983e75dfbee9e41159572aae0afa12ee51c7366ffabb530747e91bb647659"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.844362 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d45896d49-mh862" event={"ID":"862cb546-78f8-4864-a158-9dc217ec2796","Type":"ContainerStarted","Data":"9a59f11c6499b61ad8c0a8b993bd48cbdbc71b6e77f5dc55cc125d07caa3624c"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.845417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d45896d49-mh862" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.882402 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d45896d49-mh862" podStartSLOduration=10.88237759 podStartE2EDuration="10.88237759s" podCreationTimestamp="2026-02-26 11:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 11:32:28.874912563 +0000 UTC m=+1294.685739007" watchObservedRunningTime="2026-02-26 11:32:28.88237759 +0000 UTC m=+1294.693204024" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.884502 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc5565bbf-zgvcg" event={"ID":"73fd43db-ab24-441d-9912-881ef04d4572","Type":"ContainerDied","Data":"31b648d87b25df09b072d95e938824b9e321e65ded6b88d9eed7727a038a5155"} Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.884851 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dc5565bbf-zgvcg" Feb 26 11:32:28 crc kubenswrapper[4699]: I0226 11:32:28.925368 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerStarted","Data":"682d7ab4556bf77bd10ad1dbcd5f0a84777dbfdd65cbaf81b868443ac2be23e3"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.063652 4699 scope.go:117] "RemoveContainer" containerID="514b33745b4aa127708bf8765bb8617e15516309231f6f17906729d04d3d2a16" Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.098287 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"] Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.105282 4699 scope.go:117] "RemoveContainer" containerID="fb5409015c0850abe735cc049f283c49118298bd94a368b4191042b9fb38469f" Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.108006 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6dc5565bbf-zgvcg"] Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.182062 4699 scope.go:117] "RemoveContainer" containerID="f1b43b05d45b05ac3c54d378fa118972d9e5848b345eada8b66bb2c67ea89c63" Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.960513 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerStarted","Data":"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.961170 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerStarted","Data":"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.960607 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api-log" containerID="cri-o://9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" gracePeriod=30 Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.961223 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.960621 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api" containerID="cri-o://40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" gracePeriod=30 Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.970957 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" event={"ID":"9fa27ea0-52eb-406f-8256-68b4a471e452","Type":"ContainerStarted","Data":"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.971320 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.975361 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerStarted","Data":"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.981083 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d45896d49-mh862" event={"ID":"862cb546-78f8-4864-a158-9dc217ec2796","Type":"ContainerStarted","Data":"97ddf93d5e2850bacd26d60c3eae5e72a0817d976e7bfe9b76f973f92ca9f570"} Feb 26 11:32:29 crc kubenswrapper[4699]: I0226 11:32:29.995233 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.995207818 podStartE2EDuration="4.995207818s" podCreationTimestamp="2026-02-26 11:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:29.979018327 +0000 UTC m=+1295.789844781" watchObservedRunningTime="2026-02-26 11:32:29.995207818 +0000 UTC m=+1295.806034272" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.004082 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" podStartSLOduration=5.004064095 podStartE2EDuration="5.004064095s" podCreationTimestamp="2026-02-26 11:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:30.002458668 +0000 UTC m=+1295.813285112" watchObservedRunningTime="2026-02-26 11:32:30.004064095 +0000 UTC m=+1295.814890529" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.274836 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73fd43db-ab24-441d-9912-881ef04d4572" path="/var/lib/kubelet/pods/73fd43db-ab24-441d-9912-881ef04d4572/volumes" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.276071 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" path="/var/lib/kubelet/pods/b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b/volumes" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.277028 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" path="/var/lib/kubelet/pods/e9ef11cc-2a83-4f0e-b117-4be10a1c0fee/volumes" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.803007 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.906987 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdwqp\" (UniqueName: \"kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.907024 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.907135 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.907154 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 
11:32:30.907172 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.907248 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.907330 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle\") pod \"581ae159-48c4-4821-aede-361485304c59\" (UID: \"581ae159-48c4-4821-aede-361485304c59\") " Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.912784 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.913185 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts" (OuterVolumeSpecName: "scripts") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.914283 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp" (OuterVolumeSpecName: "kube-api-access-wdwqp") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "kube-api-access-wdwqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.914927 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs" (OuterVolumeSpecName: "logs") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.935161 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.963207 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.978812 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data" (OuterVolumeSpecName: "config-data") pod "581ae159-48c4-4821-aede-361485304c59" (UID: "581ae159-48c4-4821-aede-361485304c59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.993155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerStarted","Data":"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78"} Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996635 4699 generic.go:334] "Generic (PLEG): container finished" podID="581ae159-48c4-4821-aede-361485304c59" containerID="40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" exitCode=0 Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996665 4699 generic.go:334] "Generic (PLEG): container finished" podID="581ae159-48c4-4821-aede-361485304c59" containerID="9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" exitCode=143 Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996686 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996719 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerDied","Data":"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8"} Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerDied","Data":"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380"} Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996779 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"581ae159-48c4-4821-aede-361485304c59","Type":"ContainerDied","Data":"682d7ab4556bf77bd10ad1dbcd5f0a84777dbfdd65cbaf81b868443ac2be23e3"} Feb 26 11:32:30 crc kubenswrapper[4699]: I0226 11:32:30.996801 4699 scope.go:117] "RemoveContainer" containerID="40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.009386 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.009640 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdwqp\" (UniqueName: \"kubernetes.io/projected/581ae159-48c4-4821-aede-361485304c59-kube-api-access-wdwqp\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.009706 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.009908 
4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.010019 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581ae159-48c4-4821-aede-361485304c59-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.010094 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/581ae159-48c4-4821-aede-361485304c59-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.010213 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581ae159-48c4-4821-aede-361485304c59-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.094446 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.214015002 podStartE2EDuration="6.09442442s" podCreationTimestamp="2026-02-26 11:32:25 +0000 UTC" firstStartedPulling="2026-02-26 11:32:27.778289586 +0000 UTC m=+1293.589116020" lastFinishedPulling="2026-02-26 11:32:28.658699004 +0000 UTC m=+1294.469525438" observedRunningTime="2026-02-26 11:32:31.01728008 +0000 UTC m=+1296.828106534" watchObservedRunningTime="2026-02-26 11:32:31.09442442 +0000 UTC m=+1296.905250864" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.095370 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.107533 4699 scope.go:117] "RemoveContainer" containerID="9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.121023 4699 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.150917 4699 scope.go:117] "RemoveContainer" containerID="40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.152442 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8\": container with ID starting with 40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8 not found: ID does not exist" containerID="40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.152565 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8"} err="failed to get container status \"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8\": rpc error: code = NotFound desc = could not find container \"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8\": container with ID starting with 40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8 not found: ID does not exist" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.152672 4699 scope.go:117] "RemoveContainer" containerID="9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.154580 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155225 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155249 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-api" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155306 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="init" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155318 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="init" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155329 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155339 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155380 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155392 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155406 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155416 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155428 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155436 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" 
containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155487 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155498 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155518 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155552 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155573 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155582 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155595 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155602 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-api" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155643 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155655 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker-log" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155672 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155680 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155912 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155924 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155942 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.155952 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.155992 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156002 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.156014 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156022 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156353 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156402 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156418 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-httpd" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156429 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156443 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fd43db-ab24-441d-9912-881ef04d4572" containerName="neutron-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156567 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2147a5bc-be0c-4ab4-a0ee-ede87002a1a4" containerName="barbican-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156583 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156597 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a31ea66-afbf-4606-baa9-0f5fb98e5c4f" containerName="barbican-worker-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156611 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156739 4699 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="81843e2c-774f-402a-bd90-c4485ab24c05" containerName="dnsmasq-dns" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156752 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156766 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="581ae159-48c4-4821-aede-361485304c59" containerName="cinder-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156904 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef11cc-2a83-4f0e-b117-4be10a1c0fee" containerName="placement-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156920 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e2bd7a-1476-4dd9-8765-0c1bbbd7dd8b" containerName="barbican-keystone-listener-log" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.156936 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0876db8f-e235-40d9-b4a5-718097cdf02c" containerName="barbican-api" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.159792 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: E0226 11:32:31.160671 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380\": container with ID starting with 9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380 not found: ID does not exist" containerID="9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.160708 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380"} err="failed to get container status \"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380\": rpc error: code = NotFound desc = could not find container \"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380\": container with ID starting with 9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380 not found: ID does not exist" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.160734 4699 scope.go:117] "RemoveContainer" containerID="40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.161141 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8"} err="failed to get container status \"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8\": rpc error: code = NotFound desc = could not find container \"40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8\": container with ID starting with 40142f357c043763beb1bb0a8f6b18fb7ff06e2c964dea2c67cd27ad4975a2b8 not found: ID does not exist" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.161159 4699 scope.go:117] "RemoveContainer" 
containerID="9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.161484 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380"} err="failed to get container status \"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380\": rpc error: code = NotFound desc = could not find container \"9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380\": container with ID starting with 9e86abf10ae647fbec08f92496385017210d4ff2c918b5e4390bc7246666f380 not found: ID does not exist" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.162092 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.163450 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.167217 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.184782 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320319 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320384 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2c2d2c1-e68e-4b14-a732-3b42a6132503-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320425 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320465 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320487 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320527 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c2d2c1-e68e-4b14-a732-3b42a6132503-logs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320593 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5f6r\" (UniqueName: \"kubernetes.io/projected/c2c2d2c1-e68e-4b14-a732-3b42a6132503-kube-api-access-l5f6r\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: 
I0226 11:32:31.320656 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-scripts\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.320688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.422726 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2c2d2c1-e68e-4b14-a732-3b42a6132503-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.423064 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.423260 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.423363 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.423504 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c2d2c1-e68e-4b14-a732-3b42a6132503-logs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.422866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2c2d2c1-e68e-4b14-a732-3b42a6132503-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.424234 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5f6r\" (UniqueName: \"kubernetes.io/projected/c2c2d2c1-e68e-4b14-a732-3b42a6132503-kube-api-access-l5f6r\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.424551 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-scripts\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.424611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 
11:32:31.424646 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.425016 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c2d2c1-e68e-4b14-a732-3b42a6132503-logs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.427226 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.429529 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.429772 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.430682 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-scripts\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " 
pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.430843 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.440810 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c2d2c1-e68e-4b14-a732-3b42a6132503-config-data\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.445832 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5f6r\" (UniqueName: \"kubernetes.io/projected/c2c2d2c1-e68e-4b14-a732-3b42a6132503-kube-api-access-l5f6r\") pod \"cinder-api-0\" (UID: \"c2c2d2c1-e68e-4b14-a732-3b42a6132503\") " pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.487831 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 11:32:31 crc kubenswrapper[4699]: I0226 11:32:31.934587 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 11:32:31 crc kubenswrapper[4699]: W0226 11:32:31.939690 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c2d2c1_e68e_4b14_a732_3b42a6132503.slice/crio-e474cf4adc9bb11aed16ebe0fc2a10f66d43da75ae3ad333d6e6c436ad80c6fe WatchSource:0}: Error finding container e474cf4adc9bb11aed16ebe0fc2a10f66d43da75ae3ad333d6e6c436ad80c6fe: Status 404 returned error can't find the container with id e474cf4adc9bb11aed16ebe0fc2a10f66d43da75ae3ad333d6e6c436ad80c6fe Feb 26 11:32:32 crc kubenswrapper[4699]: I0226 11:32:32.062784 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2c2d2c1-e68e-4b14-a732-3b42a6132503","Type":"ContainerStarted","Data":"e474cf4adc9bb11aed16ebe0fc2a10f66d43da75ae3ad333d6e6c436ad80c6fe"} Feb 26 11:32:32 crc kubenswrapper[4699]: I0226 11:32:32.275391 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581ae159-48c4-4821-aede-361485304c59" path="/var/lib/kubelet/pods/581ae159-48c4-4821-aede-361485304c59/volumes" Feb 26 11:32:32 crc kubenswrapper[4699]: I0226 11:32:32.454575 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-67d4f89fb9-65kmq" Feb 26 11:32:33 crc kubenswrapper[4699]: I0226 11:32:33.107611 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2c2d2c1-e68e-4b14-a732-3b42a6132503","Type":"ContainerStarted","Data":"0a44045c0ef3fbc374b93d9133001d77112bcc42335dae3b11707d390ea07179"} Feb 26 11:32:33 crc kubenswrapper[4699]: I0226 11:32:33.965489 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094576 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srl4m\" (UniqueName: \"kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094762 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094794 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094944 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.094993 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.095014 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts\") pod \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\" (UID: \"7cec2d73-9ca8-4a8b-836d-efce961fbde8\") " Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.095079 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.095608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.095673 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.102334 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts" (OuterVolumeSpecName: "scripts") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.106357 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m" (OuterVolumeSpecName: "kube-api-access-srl4m") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "kube-api-access-srl4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.129779 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2c2d2c1-e68e-4b14-a732-3b42a6132503","Type":"ContainerStarted","Data":"dfce0dfae871016f2f4e74df9ef312cfcba1295385069eef7f6970c9983c1ca9"} Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.130010 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.132913 4699 generic.go:334] "Generic (PLEG): container finished" podID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerID="2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d" exitCode=0 Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.132953 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerDied","Data":"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d"} Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.132980 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cec2d73-9ca8-4a8b-836d-efce961fbde8","Type":"ContainerDied","Data":"50c24ca371e65d6a43a9a97ed072f4bd1eadffc6515aa3e571658b4eeec32c3b"} Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.132985 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.132999 4699 scope.go:117] "RemoveContainer" containerID="5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.139796 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data" (OuterVolumeSpecName: "config-data") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.143060 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.164767 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.164745175 podStartE2EDuration="3.164745175s" podCreationTimestamp="2026-02-26 11:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:34.156954189 +0000 UTC m=+1299.967780643" watchObservedRunningTime="2026-02-26 11:32:34.164745175 +0000 UTC m=+1299.975571619" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.170160 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cec2d73-9ca8-4a8b-836d-efce961fbde8" (UID: "7cec2d73-9ca8-4a8b-836d-efce961fbde8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.196942 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.196983 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srl4m\" (UniqueName: \"kubernetes.io/projected/7cec2d73-9ca8-4a8b-836d-efce961fbde8-kube-api-access-srl4m\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.196995 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.197004 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.197012 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cec2d73-9ca8-4a8b-836d-efce961fbde8-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.197020 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cec2d73-9ca8-4a8b-836d-efce961fbde8-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.249513 4699 scope.go:117] "RemoveContainer" containerID="2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.289510 4699 scope.go:117] "RemoveContainer" 
containerID="5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2" Feb 26 11:32:34 crc kubenswrapper[4699]: E0226 11:32:34.291326 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2\": container with ID starting with 5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2 not found: ID does not exist" containerID="5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.291364 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2"} err="failed to get container status \"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2\": rpc error: code = NotFound desc = could not find container \"5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2\": container with ID starting with 5b67c1b09ab95a7bde953070b4025e70711be4e1fc09bbc21308bfb6c7bb94c2 not found: ID does not exist" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.291385 4699 scope.go:117] "RemoveContainer" containerID="2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d" Feb 26 11:32:34 crc kubenswrapper[4699]: E0226 11:32:34.291630 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d\": container with ID starting with 2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d not found: ID does not exist" containerID="2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.291672 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d"} err="failed to get container status \"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d\": rpc error: code = NotFound desc = could not find container \"2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d\": container with ID starting with 2f8839485194d3f95fa8291b794501d86f5716d9498983ff8ecd5c989e725c0d not found: ID does not exist" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.483552 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.493150 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.515161 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:34 crc kubenswrapper[4699]: E0226 11:32:34.515620 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="sg-core" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.515643 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="sg-core" Feb 26 11:32:34 crc kubenswrapper[4699]: E0226 11:32:34.515688 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="ceilometer-notification-agent" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.515697 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="ceilometer-notification-agent" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.515915 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="sg-core" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.515940 4699 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" containerName="ceilometer-notification-agent" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.521872 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.525841 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.526174 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.544265 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.708885 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.708947 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.709008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.709336 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.709428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.709586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkp48\" (UniqueName: \"kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.709657 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811132 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkp48\" (UniqueName: \"kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811181 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811231 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811260 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811297 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811396 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.811418 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.813403 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.813485 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.819304 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.821753 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.830133 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.836047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0" Feb 26 
11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.836997 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkp48\" (UniqueName: \"kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48\") pod \"ceilometer-0\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " pod="openstack/ceilometer-0"
Feb 26 11:32:34 crc kubenswrapper[4699]: I0226 11:32:34.854500 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.360507 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:32:35 crc kubenswrapper[4699]: W0226 11:32:35.361340 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07f96d49_8858_4aca_b9c2_3cf489845764.slice/crio-47100af0619c1b65c26d0c6fde8e00cd8b96fe31ad0063ee87c9c6c36917918c WatchSource:0}: Error finding container 47100af0619c1b65c26d0c6fde8e00cd8b96fe31ad0063ee87c9c6c36917918c: Status 404 returned error can't find the container with id 47100af0619c1b65c26d0c6fde8e00cd8b96fe31ad0063ee87c9c6c36917918c
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.440156 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.538259 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl"
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.618923 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"]
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.623431 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="dnsmasq-dns" containerID="cri-o://306402b7645a267592b660f978f8685767bc49fa883947fdba6ed6fa1d54d19c" gracePeriod=10
Feb 26 11:32:35 crc kubenswrapper[4699]: I0226 11:32:35.719613 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.209448 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerStarted","Data":"47100af0619c1b65c26d0c6fde8e00cd8b96fe31ad0063ee87c9c6c36917918c"}
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.215030 4699 generic.go:334] "Generic (PLEG): container finished" podID="21ee9717-aaae-4511-9cee-fb022818e57d" containerID="306402b7645a267592b660f978f8685767bc49fa883947fdba6ed6fa1d54d19c" exitCode=0
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.217052 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" event={"ID":"21ee9717-aaae-4511-9cee-fb022818e57d","Type":"ContainerDied","Data":"306402b7645a267592b660f978f8685767bc49fa883947fdba6ed6fa1d54d19c"}
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.279684 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cec2d73-9ca8-4a8b-836d-efce961fbde8" path="/var/lib/kubelet/pods/7cec2d73-9ca8-4a8b-836d-efce961fbde8/volumes"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.310749 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.313400 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.487377 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.487880 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.487944 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.487997 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.488106 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcgz\" (UniqueName: \"kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.488169 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0\") pod \"21ee9717-aaae-4511-9cee-fb022818e57d\" (UID: \"21ee9717-aaae-4511-9cee-fb022818e57d\") "
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.507179 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz" (OuterVolumeSpecName: "kube-api-access-mlcgz") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "kube-api-access-mlcgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.548945 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.549166 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.559752 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.562801 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.577611 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config" (OuterVolumeSpecName: "config") pod "21ee9717-aaae-4511-9cee-fb022818e57d" (UID: "21ee9717-aaae-4511-9cee-fb022818e57d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.590456 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcgz\" (UniqueName: \"kubernetes.io/projected/21ee9717-aaae-4511-9cee-fb022818e57d-kube-api-access-mlcgz\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.591031 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.591052 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.591061 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.591071 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.591079 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ee9717-aaae-4511-9cee-fb022818e57d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.809960 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 26 11:32:36 crc kubenswrapper[4699]: E0226 11:32:36.810573 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="dnsmasq-dns"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.810594 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="dnsmasq-dns"
Feb 26 11:32:36 crc kubenswrapper[4699]: E0226 11:32:36.810622 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="init"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.810631 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="init"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.810825 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" containerName="dnsmasq-dns"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.811401 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.815453 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xh5tl"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.815457 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.816255 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.820046 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.998942 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.999068 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config-secret\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.999366 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:36 crc kubenswrapper[4699]: I0226 11:32:36.999673 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxkpt\" (UniqueName: \"kubernetes.io/projected/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-kube-api-access-nxkpt\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.101444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.101586 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config-secret\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.101611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.101727 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkpt\" (UniqueName: \"kubernetes.io/projected/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-kube-api-access-nxkpt\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.103586 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.105938 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-openstack-config-secret\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.106057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.120890 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkpt\" (UniqueName: \"kubernetes.io/projected/16db7cc3-bd7c-44aa-b92f-d2a645d96ef0-kube-api-access-nxkpt\") pod \"openstackclient\" (UID: \"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0\") " pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.126980 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.242461 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-gg27w" event={"ID":"21ee9717-aaae-4511-9cee-fb022818e57d","Type":"ContainerDied","Data":"729fa2fe733b6553118627e4796e6e00ed271782aa89de919499cdfc619cd740"}
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.242515 4699 scope.go:117] "RemoveContainer" containerID="306402b7645a267592b660f978f8685767bc49fa883947fdba6ed6fa1d54d19c"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.242622 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gg27w"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.251021 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerStarted","Data":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"}
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.251081 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="cinder-scheduler" containerID="cri-o://22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981" gracePeriod=30
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.251110 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="probe" containerID="cri-o://85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78" gracePeriod=30
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.290486 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-78cbc76b59-m6shv"]
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.290709 4699 scope.go:117] "RemoveContainer" containerID="92cf2b1cba562648cb5236aef5b4582d6ded613391d9217a2ee3e5335a2f73cf"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.292062 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.301684 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.301787 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.301946 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.310315 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78cbc76b59-m6shv"]
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.340312 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"]
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.372975 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gg27w"]
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408300 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-etc-swift\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-run-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408378 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-combined-ca-bundle\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408422 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-public-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408446 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-config-data\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtf8m\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-kube-api-access-qtf8m\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408513 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-log-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.408532 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-internal-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.510799 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-etc-swift\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.510928 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-run-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.511495 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-run-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.511571 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-combined-ca-bundle\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.512094 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-public-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.512153 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-config-data\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.512212 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf8m\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-kube-api-access-qtf8m\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.512250 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-log-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.512279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-internal-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.513856 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a4ece68-df2a-480c-9531-1d133d7f4bd0-log-httpd\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.516860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-public-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.517043 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-internal-tls-certs\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.517582 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-config-data\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.517759 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-etc-swift\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.524211 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4ece68-df2a-480c-9531-1d133d7f4bd0-combined-ca-bundle\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.533808 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf8m\" (UniqueName: \"kubernetes.io/projected/5a4ece68-df2a-480c-9531-1d133d7f4bd0-kube-api-access-qtf8m\") pod \"swift-proxy-78cbc76b59-m6shv\" (UID: \"5a4ece68-df2a-480c-9531-1d133d7f4bd0\") " pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.650822 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 26 11:32:37 crc kubenswrapper[4699]: W0226 11:32:37.654788 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16db7cc3_bd7c_44aa_b92f_d2a645d96ef0.slice/crio-2ee3b4fc6642667decf6be1b2b1d77f395d542f2813c79d3c87ab1a802f09f49 WatchSource:0}: Error finding container 2ee3b4fc6642667decf6be1b2b1d77f395d542f2813c79d3c87ab1a802f09f49: Status 404 returned error can't find the container with id 2ee3b4fc6642667decf6be1b2b1d77f395d542f2813c79d3c87ab1a802f09f49
Feb 26 11:32:37 crc kubenswrapper[4699]: I0226 11:32:37.667367 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.131470 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.289162 4699 generic.go:334] "Generic (PLEG): container finished" podID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerID="85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78" exitCode=0
Feb 26 11:32:38 crc kubenswrapper[4699]: W0226 11:32:38.303716 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a4ece68_df2a_480c_9531_1d133d7f4bd0.slice/crio-dbf6e98d742fcea8eca1e3480f96f547bbcf81fb4012078188fbc090746daeab WatchSource:0}: Error finding container dbf6e98d742fcea8eca1e3480f96f547bbcf81fb4012078188fbc090746daeab: Status 404 returned error can't find the container with id dbf6e98d742fcea8eca1e3480f96f547bbcf81fb4012078188fbc090746daeab
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308199 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ee9717-aaae-4511-9cee-fb022818e57d" path="/var/lib/kubelet/pods/21ee9717-aaae-4511-9cee-fb022818e57d/volumes"
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308794 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerDied","Data":"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78"}
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0","Type":"ContainerStarted","Data":"2ee3b4fc6642667decf6be1b2b1d77f395d542f2813c79d3c87ab1a802f09f49"}
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308844 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78cbc76b59-m6shv"]
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308859 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerStarted","Data":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"}
Feb 26 11:32:38 crc kubenswrapper[4699]: I0226 11:32:38.308869 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerStarted","Data":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"}
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.320871 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78cbc76b59-m6shv" event={"ID":"5a4ece68-df2a-480c-9531-1d133d7f4bd0","Type":"ContainerStarted","Data":"9d313c26fcbd3d6642064b0ae4b90d726851d1cb87e4a49ead108da7f89fa77e"}
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.321284 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78cbc76b59-m6shv" event={"ID":"5a4ece68-df2a-480c-9531-1d133d7f4bd0","Type":"ContainerStarted","Data":"acd5aa6b8be65943874c4007750d0de6cfc1464d6616e24207131683c54b76b0"}
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.321304 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78cbc76b59-m6shv" event={"ID":"5a4ece68-df2a-480c-9531-1d133d7f4bd0","Type":"ContainerStarted","Data":"dbf6e98d742fcea8eca1e3480f96f547bbcf81fb4012078188fbc090746daeab"}
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.321324 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.321339 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78cbc76b59-m6shv"
Feb 26 11:32:39 crc kubenswrapper[4699]: I0226 11:32:39.352656 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-78cbc76b59-m6shv" podStartSLOduration=2.352640116 podStartE2EDuration="2.352640116s" podCreationTimestamp="2026-02-26 11:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:39.35209305 +0000 UTC m=+1305.162919524" watchObservedRunningTime="2026-02-26 11:32:39.352640116 +0000 UTC m=+1305.163466560"
Feb 26 11:32:40 crc kubenswrapper[4699]: I0226 11:32:40.855063 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.338434 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerStarted","Data":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"}
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.338603 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-central-agent" containerID="cri-o://e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" gracePeriod=30
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.338804 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="proxy-httpd" containerID="cri-o://c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" gracePeriod=30
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.338818 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="sg-core" containerID="cri-o://02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" gracePeriod=30
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.338828 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-notification-agent" containerID="cri-o://9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" gracePeriod=30
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.339014 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.373926 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.881027367 podStartE2EDuration="7.373910446s" podCreationTimestamp="2026-02-26 11:32:34 +0000 UTC" firstStartedPulling="2026-02-26 11:32:35.369286336 +0000 UTC m=+1301.180112770" lastFinishedPulling="2026-02-26 11:32:40.862169415 +0000 UTC m=+1306.672995849" observedRunningTime="2026-02-26 11:32:41.3726405 +0000 UTC m=+1307.183466934" watchObservedRunningTime="2026-02-26 11:32:41.373910446 +0000 UTC m=+1307.184736870"
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.796760 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.900842 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") "
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.900973 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") "
Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.900972 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "etc-machine-id".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.901087 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9rw5\" (UniqueName: \"kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.901165 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.901231 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.901263 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle\") pod \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\" (UID: \"81e6c561-d55c-48fa-94a9-2dd7d491fd48\") " Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.901615 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e6c561-d55c-48fa-94a9-2dd7d491fd48-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.906539 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts" (OuterVolumeSpecName: "scripts") pod 
"81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.909300 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.922142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5" (OuterVolumeSpecName: "kube-api-access-j9rw5") pod "81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "kube-api-access-j9rw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:41 crc kubenswrapper[4699]: I0226 11:32:41.974291 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.003045 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.003067 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9rw5\" (UniqueName: \"kubernetes.io/projected/81e6c561-d55c-48fa-94a9-2dd7d491fd48-kube-api-access-j9rw5\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.003076 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.003132 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.041201 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data" (OuterVolumeSpecName: "config-data") pod "81e6c561-d55c-48fa-94a9-2dd7d491fd48" (UID: "81e6c561-d55c-48fa-94a9-2dd7d491fd48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.104736 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e6c561-d55c-48fa-94a9-2dd7d491fd48-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.124898 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.313734 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.310846 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.317789 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.318060 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkp48\" (UniqueName: \"kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.318302 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.318637 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.318766 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.321469 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data\") pod \"07f96d49-8858-4aca-b9c2-3cf489845764\" (UID: \"07f96d49-8858-4aca-b9c2-3cf489845764\") " Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.322256 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.317966 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.332343 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts" (OuterVolumeSpecName: "scripts") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.332453 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48" (OuterVolumeSpecName: "kube-api-access-rkp48") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "kube-api-access-rkp48". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.353228 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359223 4699 generic.go:334] "Generic (PLEG): container finished" podID="07f96d49-8858-4aca-b9c2-3cf489845764" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" exitCode=0 Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359284 4699 generic.go:334] "Generic (PLEG): container finished" podID="07f96d49-8858-4aca-b9c2-3cf489845764" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" exitCode=2 Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359294 4699 generic.go:334] "Generic (PLEG): container finished" podID="07f96d49-8858-4aca-b9c2-3cf489845764" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" exitCode=0 Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359303 4699 generic.go:334] "Generic (PLEG): container finished" podID="07f96d49-8858-4aca-b9c2-3cf489845764" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" exitCode=0 Feb 26 
11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359407 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerDied","Data":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359444 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerDied","Data":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359460 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerDied","Data":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359473 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerDied","Data":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359483 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07f96d49-8858-4aca-b9c2-3cf489845764","Type":"ContainerDied","Data":"47100af0619c1b65c26d0c6fde8e00cd8b96fe31ad0063ee87c9c6c36917918c"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359502 4699 scope.go:117] "RemoveContainer" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.359693 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.366137 4699 generic.go:334] "Generic (PLEG): container finished" podID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerID="22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981" exitCode=0 Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.369250 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.372209 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerDied","Data":"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.372269 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81e6c561-d55c-48fa-94a9-2dd7d491fd48","Type":"ContainerDied","Data":"48e983e75dfbee9e41159572aae0afa12ee51c7366ffabb530747e91bb647659"} Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.388013 4699 scope.go:117] "RemoveContainer" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.416279 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.422363 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.433291 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07f96d49-8858-4aca-b9c2-3cf489845764-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.433325 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkp48\" (UniqueName: \"kubernetes.io/projected/07f96d49-8858-4aca-b9c2-3cf489845764-kube-api-access-rkp48\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.433336 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.433345 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.433353 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.437761 4699 scope.go:117] "RemoveContainer" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.438843 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data" (OuterVolumeSpecName: "config-data") pod "07f96d49-8858-4aca-b9c2-3cf489845764" (UID: "07f96d49-8858-4aca-b9c2-3cf489845764"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.439691 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.453673 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.454103 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-notification-agent" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454138 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-notification-agent" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.454153 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="sg-core" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454159 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="sg-core" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.454171 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="probe" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454178 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="probe" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.454190 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="cinder-scheduler" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454196 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="cinder-scheduler" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 
11:32:42.454212 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-central-agent" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454219 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-central-agent" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.454240 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="proxy-httpd" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454246 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="proxy-httpd" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454412 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="probe" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454423 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="sg-core" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454438 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" containerName="cinder-scheduler" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454450 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-central-agent" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454464 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="proxy-httpd" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.454475 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" containerName="ceilometer-notification-agent" Feb 26 11:32:42 crc 
kubenswrapper[4699]: I0226 11:32:42.455468 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.458181 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.463360 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.500351 4699 scope.go:117] "RemoveContainer" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.524105 4699 scope.go:117] "RemoveContainer" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.524585 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": container with ID starting with c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351 not found: ID does not exist" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.524613 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"} err="failed to get container status \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": rpc error: code = NotFound desc = could not find container \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": container with ID starting with c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.524634 4699 scope.go:117] "RemoveContainer" 
containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.525045 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": container with ID starting with 02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9 not found: ID does not exist" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.525066 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"} err="failed to get container status \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": rpc error: code = NotFound desc = could not find container \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": container with ID starting with 02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.525079 4699 scope.go:117] "RemoveContainer" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.525824 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": container with ID starting with 9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c not found: ID does not exist" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.525865 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"} err="failed to get container status \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": rpc error: code = NotFound desc = could not find container \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": container with ID starting with 9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.525897 4699 scope.go:117] "RemoveContainer" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.526236 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": container with ID starting with e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3 not found: ID does not exist" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526261 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"} err="failed to get container status \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": rpc error: code = NotFound desc = could not find container \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": container with ID starting with e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526278 4699 scope.go:117] "RemoveContainer" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526552 4699 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"} err="failed to get container status \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": rpc error: code = NotFound desc = could not find container \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": container with ID starting with c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526576 4699 scope.go:117] "RemoveContainer" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526910 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"} err="failed to get container status \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": rpc error: code = NotFound desc = could not find container \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": container with ID starting with 02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.526958 4699 scope.go:117] "RemoveContainer" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.527765 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"} err="failed to get container status \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": rpc error: code = NotFound desc = could not find container \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": container with ID starting with 9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c not 
found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.527838 4699 scope.go:117] "RemoveContainer" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528364 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"} err="failed to get container status \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": rpc error: code = NotFound desc = could not find container \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": container with ID starting with e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528394 4699 scope.go:117] "RemoveContainer" containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528687 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"} err="failed to get container status \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": rpc error: code = NotFound desc = could not find container \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": container with ID starting with c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528713 4699 scope.go:117] "RemoveContainer" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528885 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"} err="failed to get 
container status \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": rpc error: code = NotFound desc = could not find container \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": container with ID starting with 02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.528907 4699 scope.go:117] "RemoveContainer" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.529224 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"} err="failed to get container status \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": rpc error: code = NotFound desc = could not find container \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": container with ID starting with 9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.529248 4699 scope.go:117] "RemoveContainer" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.529713 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"} err="failed to get container status \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": rpc error: code = NotFound desc = could not find container \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": container with ID starting with e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.529739 4699 scope.go:117] "RemoveContainer" 
containerID="c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.529996 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351"} err="failed to get container status \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": rpc error: code = NotFound desc = could not find container \"c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351\": container with ID starting with c714dd3df08d24893c1686f2182b14c09d893c6e1d7a839c51ccd459a3ce9351 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.530220 4699 scope.go:117] "RemoveContainer" containerID="02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.530674 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9"} err="failed to get container status \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": rpc error: code = NotFound desc = could not find container \"02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9\": container with ID starting with 02dd2ccd8503a1659c35420f8d63f4295c914297fd09360844038e1619316fe9 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.530706 4699 scope.go:117] "RemoveContainer" containerID="9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.531197 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c"} err="failed to get container status \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": rpc error: code = NotFound desc = could 
not find container \"9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c\": container with ID starting with 9d1a910b1344e0f76de46c17e25b2c0531d04770743ec34ccf90ed802219e36c not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.531222 4699 scope.go:117] "RemoveContainer" containerID="e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.531497 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3"} err="failed to get container status \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": rpc error: code = NotFound desc = could not find container \"e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3\": container with ID starting with e4311314351f6e5a0b18fcaf16a77e48c75c4d300a5b98e2ac381a56049f63e3 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.531523 4699 scope.go:117] "RemoveContainer" containerID="85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.534810 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f96d49-8858-4aca-b9c2-3cf489845764-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.557511 4699 scope.go:117] "RemoveContainer" containerID="22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.586811 4699 scope.go:117] "RemoveContainer" containerID="85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.587286 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78\": container with ID starting with 85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78 not found: ID does not exist" containerID="85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.587341 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78"} err="failed to get container status \"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78\": rpc error: code = NotFound desc = could not find container \"85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78\": container with ID starting with 85950baeb6758614d608c7fa199dcfa1f953ec17c373ed9a7510133fe5c2bb78 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.587370 4699 scope.go:117] "RemoveContainer" containerID="22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981" Feb 26 11:32:42 crc kubenswrapper[4699]: E0226 11:32:42.587624 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981\": container with ID starting with 22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981 not found: ID does not exist" containerID="22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.587654 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981"} err="failed to get container status \"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981\": rpc error: code = NotFound desc = could not find container \"22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981\": container with ID 
starting with 22eb49c2ddda393e2d578a662fade5ebe09a2739d93f76bbdbb5c7f8759be981 not found: ID does not exist" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637066 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637153 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637211 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf1f488-444f-45d3-b5e6-44506bf45f8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637273 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9rw\" (UniqueName: \"kubernetes.io/projected/fbf1f488-444f-45d3-b5e6-44506bf45f8e-kube-api-access-rr9rw\") pod \"cinder-scheduler-0\" (UID: 
\"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.637475 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.696163 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.704240 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.735194 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.738511 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739350 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf1f488-444f-45d3-b5e6-44506bf45f8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739435 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbf1f488-444f-45d3-b5e6-44506bf45f8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739699 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9rw\" (UniqueName: \"kubernetes.io/projected/fbf1f488-444f-45d3-b5e6-44506bf45f8e-kube-api-access-rr9rw\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739812 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739905 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.739994 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.743884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.744328 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.744986 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.744994 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.746406 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.748450 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf1f488-444f-45d3-b5e6-44506bf45f8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.752515 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.785501 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9rw\" (UniqueName: \"kubernetes.io/projected/fbf1f488-444f-45d3-b5e6-44506bf45f8e-kube-api-access-rr9rw\") pod \"cinder-scheduler-0\" (UID: \"fbf1f488-444f-45d3-b5e6-44506bf45f8e\") " pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.787665 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841643 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841691 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841727 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841771 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841798 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841897 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.841926 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnc2v\" (UniqueName: \"kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945015 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945456 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945496 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945585 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945610 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.945638 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnc2v\" (UniqueName: \"kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.946065 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.946455 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.952478 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.952840 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.962161 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.964065 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:42 crc kubenswrapper[4699]: I0226 11:32:42.966132 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnc2v\" (UniqueName: \"kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v\") pod \"ceilometer-0\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") " pod="openstack/ceilometer-0" Feb 26 11:32:43 crc kubenswrapper[4699]: I0226 11:32:43.090930 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:43 crc kubenswrapper[4699]: I0226 11:32:43.510341 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 11:32:43 crc kubenswrapper[4699]: I0226 11:32:43.741058 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.151723 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.288931 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f96d49-8858-4aca-b9c2-3cf489845764" path="/var/lib/kubelet/pods/07f96d49-8858-4aca-b9c2-3cf489845764/volumes" Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.290010 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e6c561-d55c-48fa-94a9-2dd7d491fd48" path="/var/lib/kubelet/pods/81e6c561-d55c-48fa-94a9-2dd7d491fd48/volumes" Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.417369 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf1f488-444f-45d3-b5e6-44506bf45f8e","Type":"ContainerStarted","Data":"d7313f7d812f0c721fef8f099cad83da0fa5cd005f09edd6e1f3b5f85eb5c41c"} Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.417685 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf1f488-444f-45d3-b5e6-44506bf45f8e","Type":"ContainerStarted","Data":"ba90720c72681e041c30d96ae30052b46b323ce9d0eb3d66eef995ce500a24cd"} Feb 26 11:32:44 crc kubenswrapper[4699]: I0226 11:32:44.420093 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerStarted","Data":"902e07d96029273f87858340fd822c319ca3a6b168bf4b0377c625b530b7ae55"} Feb 26 11:32:45 crc kubenswrapper[4699]: I0226 11:32:45.433855 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fbf1f488-444f-45d3-b5e6-44506bf45f8e","Type":"ContainerStarted","Data":"6a6319f2fcf1acee6f01d40acf526a906716c01afd194e182603f39590d2124d"} Feb 26 11:32:45 crc kubenswrapper[4699]: I0226 11:32:45.442973 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerStarted","Data":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"} Feb 26 11:32:45 crc kubenswrapper[4699]: I0226 11:32:45.443013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerStarted","Data":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"} Feb 26 11:32:45 crc kubenswrapper[4699]: I0226 11:32:45.456217 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4562006 podStartE2EDuration="3.4562006s" podCreationTimestamp="2026-02-26 11:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:45.452356909 +0000 UTC m=+1311.263183343" watchObservedRunningTime="2026-02-26 11:32:45.4562006 +0000 UTC m=+1311.267027034" Feb 26 11:32:45 crc kubenswrapper[4699]: I0226 11:32:45.830457 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:47 crc kubenswrapper[4699]: I0226 11:32:47.465826 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerStarted","Data":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"} Feb 26 11:32:47 crc kubenswrapper[4699]: I0226 11:32:47.673746 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-78cbc76b59-m6shv" Feb 26 11:32:47 crc kubenswrapper[4699]: I0226 11:32:47.675696 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78cbc76b59-m6shv" Feb 26 11:32:47 crc kubenswrapper[4699]: I0226 11:32:47.788239 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.130947 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57899c756d-w9pc5" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.131095 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.489735 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d45896d49-mh862" Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.581096 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.581483 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59dd795c56-7kv72" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-api" containerID="cri-o://e5405c70871ee395752c1da1df07066a938aedcb1ac422f960283753ab469ce2" gracePeriod=30 Feb 26 11:32:48 crc kubenswrapper[4699]: I0226 11:32:48.582090 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59dd795c56-7kv72" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-httpd" 
containerID="cri-o://79c878075032024e487997f5af9db4e3be830d392f6df8fc08ba5dbf79596db4" gracePeriod=30 Feb 26 11:32:49 crc kubenswrapper[4699]: I0226 11:32:49.487900 4699 generic.go:334] "Generic (PLEG): container finished" podID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerID="79c878075032024e487997f5af9db4e3be830d392f6df8fc08ba5dbf79596db4" exitCode=0 Feb 26 11:32:49 crc kubenswrapper[4699]: I0226 11:32:49.487968 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerDied","Data":"79c878075032024e487997f5af9db4e3be830d392f6df8fc08ba5dbf79596db4"} Feb 26 11:32:53 crc kubenswrapper[4699]: I0226 11:32:53.154417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 11:32:53 crc kubenswrapper[4699]: I0226 11:32:53.530949 4699 generic.go:334] "Generic (PLEG): container finished" podID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerID="de9a25314ef41f7d3414b57dcaeec2a9add4d5ecb708b80dc9af27c79856ba9b" exitCode=137 Feb 26 11:32:53 crc kubenswrapper[4699]: I0226 11:32:53.530990 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerDied","Data":"de9a25314ef41f7d3414b57dcaeec2a9add4d5ecb708b80dc9af27c79856ba9b"} Feb 26 11:32:53 crc kubenswrapper[4699]: I0226 11:32:53.958248 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57899c756d-w9pc5" Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.101302 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.102164 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs" (OuterVolumeSpecName: "logs") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.102338 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.102988 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.103045 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") " Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.103083 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.103300 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.103351 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-252gw\" (UniqueName: \"kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw\") pod \"78d85906-b78a-46eb-b5dd-4da95c1222d8\" (UID: \"78d85906-b78a-46eb-b5dd-4da95c1222d8\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.104376 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d85906-b78a-46eb-b5dd-4da95c1222d8-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.108174 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.112129 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw" (OuterVolumeSpecName: "kube-api-access-252gw") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "kube-api-access-252gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.134566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts" (OuterVolumeSpecName: "scripts") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.135554 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data" (OuterVolumeSpecName: "config-data") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.137308 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.186189 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "78d85906-b78a-46eb-b5dd-4da95c1222d8" (UID: "78d85906-b78a-46eb-b5dd-4da95c1222d8"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208308 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208342 4699 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208356 4699 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208371 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-252gw\" (UniqueName: \"kubernetes.io/projected/78d85906-b78a-46eb-b5dd-4da95c1222d8-kube-api-access-252gw\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208385 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78d85906-b78a-46eb-b5dd-4da95c1222d8-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.208396 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d85906-b78a-46eb-b5dd-4da95c1222d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.540811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"16db7cc3-bd7c-44aa-b92f-d2a645d96ef0","Type":"ContainerStarted","Data":"7a084309507e408ca5233a482e41c1bf08c7da3ff18ca5d93123d8caac0c9c63"}
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.543890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57899c756d-w9pc5" event={"ID":"78d85906-b78a-46eb-b5dd-4da95c1222d8","Type":"ContainerDied","Data":"70b6c63ca13b9c59a7d033612c4fd91b9c2d11c7f06db99a50ef89d5c7c7c5da"}
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.544169 4699 scope.go:117] "RemoveContainer" containerID="5570b961c7c2f73533bbe65fa87a9f8cc0b880e79add1f25b918377e32b9375d"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.544267 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/horizon-57899c756d-w9pc5"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.556161 4699 generic.go:334] "Generic (PLEG): container finished" podID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerID="e5405c70871ee395752c1da1df07066a938aedcb1ac422f960283753ab469ce2" exitCode=0
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.556225 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerDied","Data":"e5405c70871ee395752c1da1df07066a938aedcb1ac422f960283753ab469ce2"}
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.578487 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.671453034 podStartE2EDuration="18.578468089s" podCreationTimestamp="2026-02-26 11:32:36 +0000 UTC" firstStartedPulling="2026-02-26 11:32:37.657299522 +0000 UTC m=+1303.468125956" lastFinishedPulling="2026-02-26 11:32:53.564314577 +0000 UTC m=+1319.375141011" observedRunningTime="2026-02-26 11:32:54.57539743 +0000 UTC m=+1320.386223884" watchObservedRunningTime="2026-02-26 11:32:54.578468089 +0000 UTC m=+1320.389294523"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.591920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerStarted","Data":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"}
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.592144 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-central-agent" containerID="cri-o://a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" gracePeriod=30
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.592462 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.592790 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="proxy-httpd" containerID="cri-o://f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" gracePeriod=30
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.592846 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="sg-core" containerID="cri-o://8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" gracePeriod=30
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.592887 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-notification-agent" containerID="cri-o://b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" gracePeriod=30
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.622130 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-59dd795c56-7kv72"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.648336 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8557516720000002 podStartE2EDuration="12.648315398s" podCreationTimestamp="2026-02-26 11:32:42 +0000 UTC" firstStartedPulling="2026-02-26 11:32:43.765570862 +0000 UTC m=+1309.576397296" lastFinishedPulling="2026-02-26 11:32:53.558134588 +0000 UTC m=+1319.368961022" observedRunningTime="2026-02-26 11:32:54.641299614 +0000 UTC m=+1320.452126068" watchObservedRunningTime="2026-02-26 11:32:54.648315398 +0000 UTC m=+1320.459141832"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.720666 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.720764 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8wb\" (UniqueName: \"kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.720842 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.720986 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.721030 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.730147 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57899c756d-w9pc5"]
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.734650 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb" (OuterVolumeSpecName: "kube-api-access-st8wb") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "kube-api-access-st8wb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.740439 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57899c756d-w9pc5"]
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.752325 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "httpd-config".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.816318 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.822201 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config" (OuterVolumeSpecName: "config") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.822928 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") pod \"715a80f0-cdba-439c-8a82-4838bf8f7e50\" (UID: \"715a80f0-cdba-439c-8a82-4838bf8f7e50\") "
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.823866 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.823888 4699 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.823901 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8wb\" (UniqueName: \"kubernetes.io/projected/715a80f0-cdba-439c-8a82-4838bf8f7e50-kube-api-access-st8wb\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: W0226 11:32:54.823995 4699 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/715a80f0-cdba-439c-8a82-4838bf8f7e50/volumes/kubernetes.io~secret/config
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.824016 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config" (OuterVolumeSpecName: "config") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.825736 4699 scope.go:117] "RemoveContainer" containerID="de9a25314ef41f7d3414b57dcaeec2a9add4d5ecb708b80dc9af27c79856ba9b"
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.859766 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "715a80f0-cdba-439c-8a82-4838bf8f7e50" (UID: "715a80f0-cdba-439c-8a82-4838bf8f7e50"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.925338 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-config\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:54 crc kubenswrapper[4699]: I0226 11:32:54.925600 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715a80f0-cdba-439c-8a82-4838bf8f7e50-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.352614 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434161 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434309 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434365 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnc2v\" (UniqueName: \"kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434391 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434439 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.434946 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.435096 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.435200 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle\") pod \"6b07016c-61a8-4b19-8635-4f6475523855\" (UID: \"6b07016c-61a8-4b19-8635-4f6475523855\") "
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.435216 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855").
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.435798 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.435818 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07016c-61a8-4b19-8635-4f6475523855-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.438957 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts" (OuterVolumeSpecName: "scripts") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.439287 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v" (OuterVolumeSpecName: "kube-api-access-vnc2v") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "kube-api-access-vnc2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.461906 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.529823 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.537506 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.537588 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnc2v\" (UniqueName: \"kubernetes.io/projected/6b07016c-61a8-4b19-8635-4f6475523855-kube-api-access-vnc2v\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.537604 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.537631 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.557600 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data" (OuterVolumeSpecName: "config-data") pod "6b07016c-61a8-4b19-8635-4f6475523855" (UID: "6b07016c-61a8-4b19-8635-4f6475523855"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.636577 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59dd795c56-7kv72" event={"ID":"715a80f0-cdba-439c-8a82-4838bf8f7e50","Type":"ContainerDied","Data":"6345d756a7b816036dc69f325dd74145097fc551abbeb710dfcdf0451b76e1c8"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.636633 4699 scope.go:117] "RemoveContainer" containerID="79c878075032024e487997f5af9db4e3be830d392f6df8fc08ba5dbf79596db4"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.636647 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59dd795c56-7kv72"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.649087 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07016c-61a8-4b19-8635-4f6475523855-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.671967 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b07016c-61a8-4b19-8635-4f6475523855" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" exitCode=0
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.672001 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b07016c-61a8-4b19-8635-4f6475523855" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" exitCode=2
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.672010 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b07016c-61a8-4b19-8635-4f6475523855" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" exitCode=0
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.672018 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b07016c-61a8-4b19-8635-4f6475523855" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" exitCode=0
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.672969 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.673005 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerDied","Data":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.673035 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerDied","Data":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.673047 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerDied","Data":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.673060 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerDied","Data":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.673069 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07016c-61a8-4b19-8635-4f6475523855","Type":"ContainerDied","Data":"902e07d96029273f87858340fd822c319ca3a6b168bf4b0377c625b530b7ae55"}
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.695313 4699 scope.go:117] "RemoveContainer" containerID="e5405c70871ee395752c1da1df07066a938aedcb1ac422f960283753ab469ce2"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.699878 4699 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/nova-api-db-create-snmfx"]
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700299 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-httpd"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700311 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-httpd"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700328 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-api"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700333 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-api"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700350 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-notification-agent"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700356 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-notification-agent"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700368 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon-log"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700374 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon-log"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700392 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="sg-core"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700398 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="sg-core"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700407 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700412 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700422 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-central-agent"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700428 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-central-agent"
Feb 26 11:32:55 crc kubenswrapper[4699]: E0226 11:32:55.700446 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="proxy-httpd"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.700453 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="proxy-httpd"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702084 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" containerName="horizon-log"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702156 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-central-agent"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702174 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="proxy-httpd"
Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702227 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8"
containerName="horizon" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702251 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="sg-core" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702266 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-httpd" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702308 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07016c-61a8-4b19-8635-4f6475523855" containerName="ceilometer-notification-agent" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.702322 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" containerName="neutron-api" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.704058 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.711555 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-snmfx"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.727236 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.740873 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.747182 4699 scope.go:117] "RemoveContainer" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.791465 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.793498 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.797083 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.797291 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.799173 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.831524 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-59dd795c56-7kv72"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.842428 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.848723 4699 scope.go:117] "RemoveContainer" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.852332 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.852471 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znlcj\" (UniqueName: \"kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.852954 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-db-create-62mhs"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.854304 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.868973 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62mhs"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.886167 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3146-account-create-update-xf6c8"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.887475 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.898277 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3146-account-create-update-xf6c8"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.904743 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.941876 4699 scope.go:117] "RemoveContainer" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955088 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " 
pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955201 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89884\" (UniqueName: \"kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955220 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts\") pod \"nova-cell0-db-create-62mhs\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955274 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcfgb\" (UniqueName: \"kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb\") pod \"nova-cell0-db-create-62mhs\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 
11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955290 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955310 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955354 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.955391 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znlcj\" (UniqueName: \"kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.956723 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.966259 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-db-create-hq69l"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.967761 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.975343 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hq69l"] Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.976073 4699 scope.go:117] "RemoveContainer" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:55 crc kubenswrapper[4699]: I0226 11:32:55.995233 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znlcj\" (UniqueName: \"kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj\") pod \"nova-api-db-create-snmfx\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.055915 4699 scope.go:117] "RemoveContainer" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056108 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:56 crc kubenswrapper[4699]: E0226 11:32:56.056390 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": container with ID starting with f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca not found: ID does not exist" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056428 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"} err="failed to get container status \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": rpc error: code = NotFound desc = could not find container \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": container with ID starting with f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056453 4699 scope.go:117] "RemoveContainer" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:56 crc kubenswrapper[4699]: E0226 11:32:56.056769 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": container with ID starting with 8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb not found: ID does not exist" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056801 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"} 
err="failed to get container status \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": rpc error: code = NotFound desc = could not find container \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": container with ID starting with 8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056819 4699 scope.go:117] "RemoveContainer" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056849 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056894 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssrg\" (UniqueName: \"kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056925 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89884\" (UniqueName: \"kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056948 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts\") pod \"nova-cell0-db-create-62mhs\" (UID: 
\"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.056962 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057160 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcfgb\" (UniqueName: \"kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb\") pod \"nova-cell0-db-create-62mhs\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057222 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057280 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057409 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrl4z\" (UniqueName: \"kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057538 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057591 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.057674 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: E0226 11:32:56.061280 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": container with ID starting with b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399 not found: ID does not exist" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 
11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.061331 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"} err="failed to get container status \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": rpc error: code = NotFound desc = could not find container \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": container with ID starting with b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.061360 4699 scope.go:117] "RemoveContainer" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062136 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062239 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062279 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: E0226 11:32:56.062593 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": container with ID starting with a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924 not found: ID does not exist" 
containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062641 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"} err="failed to get container status \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": rpc error: code = NotFound desc = could not find container \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": container with ID starting with a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062671 4699 scope.go:117] "RemoveContainer" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.062939 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts\") pod \"nova-cell0-db-create-62mhs\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.063840 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.066004 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.070966 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"} err="failed to 
get container status \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": rpc error: code = NotFound desc = could not find container \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": container with ID starting with f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.071258 4699 scope.go:117] "RemoveContainer" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.075136 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"} err="failed to get container status \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": rpc error: code = NotFound desc = could not find container \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": container with ID starting with 8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.075195 4699 scope.go:117] "RemoveContainer" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.075651 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.077519 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 
crc kubenswrapper[4699]: I0226 11:32:56.077596 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-66cf-account-create-update-qvvdk"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.079434 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.080359 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"} err="failed to get container status \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": rpc error: code = NotFound desc = could not find container \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": container with ID starting with b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.080405 4699 scope.go:117] "RemoveContainer" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.081458 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"} err="failed to get container status \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": rpc error: code = NotFound desc = could not find container \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": container with ID starting with a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.081512 4699 scope.go:117] "RemoveContainer" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.082872 4699 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.083382 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"} err="failed to get container status \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": rpc error: code = NotFound desc = could not find container \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": container with ID starting with f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.083404 4699 scope.go:117] "RemoveContainer" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.083953 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"} err="failed to get container status \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": rpc error: code = NotFound desc = could not find container \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": container with ID starting with 8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.083977 4699 scope.go:117] "RemoveContainer" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.084325 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"} err="failed to get container status \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": rpc error: code = NotFound desc = could not 
find container \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": container with ID starting with b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.084343 4699 scope.go:117] "RemoveContainer" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.085330 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"} err="failed to get container status \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": rpc error: code = NotFound desc = could not find container \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": container with ID starting with a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.085362 4699 scope.go:117] "RemoveContainer" containerID="f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.086040 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcfgb\" (UniqueName: \"kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb\") pod \"nova-cell0-db-create-62mhs\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.086336 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca"} err="failed to get container status \"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": rpc error: code = NotFound desc = could not find container 
\"f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca\": container with ID starting with f7f57995e81f7d57f16c415cdd4fc8b824f88da1357748823087bec1b66775ca not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.086474 4699 scope.go:117] "RemoveContainer" containerID="8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.086827 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb"} err="failed to get container status \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": rpc error: code = NotFound desc = could not find container \"8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb\": container with ID starting with 8a2b3d98e1b247b2e00c05dc8e82dc50d659148596d3e539832fe8084c0ef4fb not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.086973 4699 scope.go:117] "RemoveContainer" containerID="b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.087302 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.088212 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89884\" (UniqueName: \"kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884\") pod \"ceilometer-0\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.088221 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-66cf-account-create-update-qvvdk"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.088524 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399"} err="failed to get container status \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": rpc error: code = NotFound desc = could not find container \"b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399\": container with ID starting with b81ac7705e4e99a6910779d59e3a0a7b4d83c6a14f827e737f193b76841b4399 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.088705 4699 scope.go:117] "RemoveContainer" containerID="a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.089091 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924"} err="failed to get container status \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": rpc error: code = NotFound desc = could not find container \"a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924\": container with ID starting with a35ff88a75232546cd35dc8ab0d6697d72733b6a1f0634f1f02e398729694924 not found: ID does not exist" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.120212 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.160782 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssrg\" (UniqueName: \"kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.161057 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.161175 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrl4z\" (UniqueName: \"kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.161261 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.162081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: 
\"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.163011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.180990 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrl4z\" (UniqueName: \"kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z\") pod \"nova-api-3146-account-create-update-xf6c8\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.184688 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssrg\" (UniqueName: \"kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg\") pod \"nova-cell1-db-create-hq69l\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.234968 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.244916 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.268292 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl2p\" (UniqueName: \"kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.268815 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.332855 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.353833 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b07016c-61a8-4b19-8635-4f6475523855" path="/var/lib/kubelet/pods/6b07016c-61a8-4b19-8635-4f6475523855/volumes" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.354751 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715a80f0-cdba-439c-8a82-4838bf8f7e50" path="/var/lib/kubelet/pods/715a80f0-cdba-439c-8a82-4838bf8f7e50/volumes" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.355985 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d85906-b78a-46eb-b5dd-4da95c1222d8" path="/var/lib/kubelet/pods/78d85906-b78a-46eb-b5dd-4da95c1222d8/volumes" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.356825 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-43f0-account-create-update-vgmlz"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.358359 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-43f0-account-create-update-vgmlz"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.358508 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.360497 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.370978 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.371204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl2p\" (UniqueName: \"kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.373846 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.396047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl2p\" (UniqueName: \"kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p\") pod \"nova-cell0-66cf-account-create-update-qvvdk\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 
11:32:56.473632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgx45\" (UniqueName: \"kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.473771 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.496633 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.575663 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgx45\" (UniqueName: \"kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.576022 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.577485 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.599281 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgx45\" (UniqueName: \"kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45\") pod \"nova-cell1-43f0-account-create-update-vgmlz\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.675320 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.728679 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.769411 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerStarted","Data":"56fc6b0ffcfa6ade2b63264a18f35f46ed39dca62b34ae50c392d8a43e061b9a"} Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.904707 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-snmfx"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.914249 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3146-account-create-update-xf6c8"] Feb 26 11:32:56 crc kubenswrapper[4699]: I0226 11:32:56.982018 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.177500 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62mhs"] Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.267978 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hq69l"] Feb 26 11:32:57 crc kubenswrapper[4699]: W0226 11:32:57.287917 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86425865_434f_43e8_9592_e890078837a2.slice/crio-cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b WatchSource:0}: Error finding container cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b: Status 404 returned error can't find the container with id cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.334126 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-66cf-account-create-update-qvvdk"] Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.505318 4699 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-43f0-account-create-update-vgmlz"] Feb 26 11:32:57 crc kubenswrapper[4699]: W0226 11:32:57.514337 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea40818_89fa_4b78_9833_82635861fee1.slice/crio-28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5 WatchSource:0}: Error finding container 28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5: Status 404 returned error can't find the container with id 28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5 Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.805013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerStarted","Data":"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.808404 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4a229eb-75a5-41b1-8342-53a3a1b433a0" containerID="b2b62d6d79c5c992c3884d7e4c7aa453502b8500701d02db975cc913cb332656" exitCode=0 Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.808473 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3146-account-create-update-xf6c8" event={"ID":"f4a229eb-75a5-41b1-8342-53a3a1b433a0","Type":"ContainerDied","Data":"b2b62d6d79c5c992c3884d7e4c7aa453502b8500701d02db975cc913cb332656"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.808503 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3146-account-create-update-xf6c8" event={"ID":"f4a229eb-75a5-41b1-8342-53a3a1b433a0","Type":"ContainerStarted","Data":"129588bf429da3e10dc74316d2ee44e760af7923cd0ca87e4e69a1941943c196"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.810267 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" event={"ID":"dea40818-89fa-4b78-9833-82635861fee1","Type":"ContainerStarted","Data":"28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.812694 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" event={"ID":"f99c6b36-a5f6-4f0b-973f-dfa853d2c558","Type":"ContainerStarted","Data":"ea224b941b0465af7d8b7b7d5e0297ed56d62f796e3b6566730ce00cb01d16ec"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.812731 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" event={"ID":"f99c6b36-a5f6-4f0b-973f-dfa853d2c558","Type":"ContainerStarted","Data":"2361c68db1b6f7a97eaf5c8da87ffb02f30d64940e48194e78df269546c62761"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.815798 4699 generic.go:334] "Generic (PLEG): container finished" podID="6c5acc31-dbe4-4698-8346-9a0dbc05234b" containerID="0569f07824e60d0703bc892d604ca5230523b1fde72c768bd283ae0d47703780" exitCode=0 Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.815865 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snmfx" event={"ID":"6c5acc31-dbe4-4698-8346-9a0dbc05234b","Type":"ContainerDied","Data":"0569f07824e60d0703bc892d604ca5230523b1fde72c768bd283ae0d47703780"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.815887 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snmfx" event={"ID":"6c5acc31-dbe4-4698-8346-9a0dbc05234b","Type":"ContainerStarted","Data":"367226eed7c68edeebb6220e8afa4a67d5e1c5ee276d3ca9afc548c3ebb597e1"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.818562 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hq69l" 
event={"ID":"86425865-434f-43e8-9592-e890078837a2","Type":"ContainerStarted","Data":"e2f8c469ec04f6028bf261997ea76ce892a579e71cd0b1e3cbda4d1a898468a0"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.818606 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hq69l" event={"ID":"86425865-434f-43e8-9592-e890078837a2","Type":"ContainerStarted","Data":"cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.825471 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e54b257-33a7-43bd-80c5-30915ae82341" containerID="9eff27ca91f87caa5ed2a02975a6d6bc2e239264a6a323e5cbc0471084500265" exitCode=0 Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.825522 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62mhs" event={"ID":"7e54b257-33a7-43bd-80c5-30915ae82341","Type":"ContainerDied","Data":"9eff27ca91f87caa5ed2a02975a6d6bc2e239264a6a323e5cbc0471084500265"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.825548 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62mhs" event={"ID":"7e54b257-33a7-43bd-80c5-30915ae82341","Type":"ContainerStarted","Data":"a09cad48346ffd6c0e7b88aa7cf1e96d3627b8eded2b64f26a84ecb47f6b8740"} Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.854833 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-hq69l" podStartSLOduration=2.854813687 podStartE2EDuration="2.854813687s" podCreationTimestamp="2026-02-26 11:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:57.850387779 +0000 UTC m=+1323.661214213" watchObservedRunningTime="2026-02-26 11:32:57.854813687 +0000 UTC m=+1323.665640121" Feb 26 11:32:57 crc kubenswrapper[4699]: I0226 11:32:57.873032 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" podStartSLOduration=1.8730150559999998 podStartE2EDuration="1.873015056s" podCreationTimestamp="2026-02-26 11:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:57.864159909 +0000 UTC m=+1323.674986353" watchObservedRunningTime="2026-02-26 11:32:57.873015056 +0000 UTC m=+1323.683841490" Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.838653 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerStarted","Data":"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1"} Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.840818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" event={"ID":"dea40818-89fa-4b78-9833-82635861fee1","Type":"ContainerStarted","Data":"853cdd9a99dcd559f8a9a9863c9ecd3351cc72fb23481557abd22c41a3816b2d"} Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.842256 4699 generic.go:334] "Generic (PLEG): container finished" podID="f99c6b36-a5f6-4f0b-973f-dfa853d2c558" containerID="ea224b941b0465af7d8b7b7d5e0297ed56d62f796e3b6566730ce00cb01d16ec" exitCode=0 Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.842341 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" event={"ID":"f99c6b36-a5f6-4f0b-973f-dfa853d2c558","Type":"ContainerDied","Data":"ea224b941b0465af7d8b7b7d5e0297ed56d62f796e3b6566730ce00cb01d16ec"} Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.850747 4699 generic.go:334] "Generic (PLEG): container finished" podID="86425865-434f-43e8-9592-e890078837a2" containerID="e2f8c469ec04f6028bf261997ea76ce892a579e71cd0b1e3cbda4d1a898468a0" exitCode=0 Feb 26 
11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.851203 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hq69l" event={"ID":"86425865-434f-43e8-9592-e890078837a2","Type":"ContainerDied","Data":"e2f8c469ec04f6028bf261997ea76ce892a579e71cd0b1e3cbda4d1a898468a0"} Feb 26 11:32:58 crc kubenswrapper[4699]: I0226 11:32:58.870322 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" podStartSLOduration=2.870294618 podStartE2EDuration="2.870294618s" podCreationTimestamp="2026-02-26 11:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:32:58.858476165 +0000 UTC m=+1324.669302629" watchObservedRunningTime="2026-02-26 11:32:58.870294618 +0000 UTC m=+1324.681121092" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.187187 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.390608 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.396689 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.403975 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.455977 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znlcj\" (UniqueName: \"kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj\") pod \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.456080 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts\") pod \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\" (UID: \"6c5acc31-dbe4-4698-8346-9a0dbc05234b\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.456216 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcfgb\" (UniqueName: \"kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb\") pod \"7e54b257-33a7-43bd-80c5-30915ae82341\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.456243 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts\") pod \"7e54b257-33a7-43bd-80c5-30915ae82341\" (UID: \"7e54b257-33a7-43bd-80c5-30915ae82341\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.456944 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c5acc31-dbe4-4698-8346-9a0dbc05234b" (UID: "6c5acc31-dbe4-4698-8346-9a0dbc05234b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.457075 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e54b257-33a7-43bd-80c5-30915ae82341" (UID: "7e54b257-33a7-43bd-80c5-30915ae82341"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.462931 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj" (OuterVolumeSpecName: "kube-api-access-znlcj") pod "6c5acc31-dbe4-4698-8346-9a0dbc05234b" (UID: "6c5acc31-dbe4-4698-8346-9a0dbc05234b"). InnerVolumeSpecName "kube-api-access-znlcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.463381 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb" (OuterVolumeSpecName: "kube-api-access-jcfgb") pod "7e54b257-33a7-43bd-80c5-30915ae82341" (UID: "7e54b257-33a7-43bd-80c5-30915ae82341"). InnerVolumeSpecName "kube-api-access-jcfgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.557897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts\") pod \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.558297 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrl4z\" (UniqueName: \"kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z\") pod \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\" (UID: \"f4a229eb-75a5-41b1-8342-53a3a1b433a0\") " Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.559018 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znlcj\" (UniqueName: \"kubernetes.io/projected/6c5acc31-dbe4-4698-8346-9a0dbc05234b-kube-api-access-znlcj\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.559042 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c5acc31-dbe4-4698-8346-9a0dbc05234b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.559054 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcfgb\" (UniqueName: \"kubernetes.io/projected/7e54b257-33a7-43bd-80c5-30915ae82341-kube-api-access-jcfgb\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.559065 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e54b257-33a7-43bd-80c5-30915ae82341-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.560080 4699 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4a229eb-75a5-41b1-8342-53a3a1b433a0" (UID: "f4a229eb-75a5-41b1-8342-53a3a1b433a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.567619 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z" (OuterVolumeSpecName: "kube-api-access-mrl4z") pod "f4a229eb-75a5-41b1-8342-53a3a1b433a0" (UID: "f4a229eb-75a5-41b1-8342-53a3a1b433a0"). InnerVolumeSpecName "kube-api-access-mrl4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.660683 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a229eb-75a5-41b1-8342-53a3a1b433a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.660724 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrl4z\" (UniqueName: \"kubernetes.io/projected/f4a229eb-75a5-41b1-8342-53a3a1b433a0-kube-api-access-mrl4z\") on node \"crc\" DevicePath \"\"" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.861046 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-62mhs" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.861057 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62mhs" event={"ID":"7e54b257-33a7-43bd-80c5-30915ae82341","Type":"ContainerDied","Data":"a09cad48346ffd6c0e7b88aa7cf1e96d3627b8eded2b64f26a84ecb47f6b8740"} Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.861100 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09cad48346ffd6c0e7b88aa7cf1e96d3627b8eded2b64f26a84ecb47f6b8740" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.862797 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3146-account-create-update-xf6c8" event={"ID":"f4a229eb-75a5-41b1-8342-53a3a1b433a0","Type":"ContainerDied","Data":"129588bf429da3e10dc74316d2ee44e760af7923cd0ca87e4e69a1941943c196"} Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.862830 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="129588bf429da3e10dc74316d2ee44e760af7923cd0ca87e4e69a1941943c196" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.862896 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3146-account-create-update-xf6c8" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.866033 4699 generic.go:334] "Generic (PLEG): container finished" podID="dea40818-89fa-4b78-9833-82635861fee1" containerID="853cdd9a99dcd559f8a9a9863c9ecd3351cc72fb23481557abd22c41a3816b2d" exitCode=0 Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.866092 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" event={"ID":"dea40818-89fa-4b78-9833-82635861fee1","Type":"ContainerDied","Data":"853cdd9a99dcd559f8a9a9863c9ecd3351cc72fb23481557abd22c41a3816b2d"} Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.868743 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-snmfx" Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.868789 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snmfx" event={"ID":"6c5acc31-dbe4-4698-8346-9a0dbc05234b","Type":"ContainerDied","Data":"367226eed7c68edeebb6220e8afa4a67d5e1c5ee276d3ca9afc548c3ebb597e1"} Feb 26 11:32:59 crc kubenswrapper[4699]: I0226 11:32:59.868815 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367226eed7c68edeebb6220e8afa4a67d5e1c5ee276d3ca9afc548c3ebb597e1" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.299842 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.345487 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.381490 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bssrg\" (UniqueName: \"kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg\") pod \"86425865-434f-43e8-9592-e890078837a2\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.381681 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts\") pod \"86425865-434f-43e8-9592-e890078837a2\" (UID: \"86425865-434f-43e8-9592-e890078837a2\") " Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.383356 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86425865-434f-43e8-9592-e890078837a2" (UID: "86425865-434f-43e8-9592-e890078837a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.386281 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg" (OuterVolumeSpecName: "kube-api-access-bssrg") pod "86425865-434f-43e8-9592-e890078837a2" (UID: "86425865-434f-43e8-9592-e890078837a2"). InnerVolumeSpecName "kube-api-access-bssrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.494740 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts\") pod \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.495088 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nl2p\" (UniqueName: \"kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p\") pod \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\" (UID: \"f99c6b36-a5f6-4f0b-973f-dfa853d2c558\") " Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.495671 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86425865-434f-43e8-9592-e890078837a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.495701 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bssrg\" (UniqueName: \"kubernetes.io/projected/86425865-434f-43e8-9592-e890078837a2-kube-api-access-bssrg\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.501187 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f99c6b36-a5f6-4f0b-973f-dfa853d2c558" (UID: "f99c6b36-a5f6-4f0b-973f-dfa853d2c558"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.504318 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p" (OuterVolumeSpecName: "kube-api-access-7nl2p") pod "f99c6b36-a5f6-4f0b-973f-dfa853d2c558" (UID: "f99c6b36-a5f6-4f0b-973f-dfa853d2c558"). InnerVolumeSpecName "kube-api-access-7nl2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.597102 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.597156 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nl2p\" (UniqueName: \"kubernetes.io/projected/f99c6b36-a5f6-4f0b-973f-dfa853d2c558-kube-api-access-7nl2p\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.878906 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerStarted","Data":"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921"} Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.881628 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.881676 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66cf-account-create-update-qvvdk" event={"ID":"f99c6b36-a5f6-4f0b-973f-dfa853d2c558","Type":"ContainerDied","Data":"2361c68db1b6f7a97eaf5c8da87ffb02f30d64940e48194e78df269546c62761"} Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.881707 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2361c68db1b6f7a97eaf5c8da87ffb02f30d64940e48194e78df269546c62761" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.883180 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hq69l" Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.883202 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hq69l" event={"ID":"86425865-434f-43e8-9592-e890078837a2","Type":"ContainerDied","Data":"cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b"} Feb 26 11:33:00 crc kubenswrapper[4699]: I0226 11:33:00.883255 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cae37a1dd41503ce2668296075b414232a79b4124cf7606abb527f141a12944b" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.236744 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.309873 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts\") pod \"dea40818-89fa-4b78-9833-82635861fee1\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.309953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgx45\" (UniqueName: \"kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45\") pod \"dea40818-89fa-4b78-9833-82635861fee1\" (UID: \"dea40818-89fa-4b78-9833-82635861fee1\") " Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.310961 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dea40818-89fa-4b78-9833-82635861fee1" (UID: "dea40818-89fa-4b78-9833-82635861fee1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.314958 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45" (OuterVolumeSpecName: "kube-api-access-qgx45") pod "dea40818-89fa-4b78-9833-82635861fee1" (UID: "dea40818-89fa-4b78-9833-82635861fee1"). InnerVolumeSpecName "kube-api-access-qgx45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.411874 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dea40818-89fa-4b78-9833-82635861fee1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.411912 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgx45\" (UniqueName: \"kubernetes.io/projected/dea40818-89fa-4b78-9833-82635861fee1-kube-api-access-qgx45\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.892608 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" event={"ID":"dea40818-89fa-4b78-9833-82635861fee1","Type":"ContainerDied","Data":"28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5"} Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.892663 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d6dd8244e4d61e26f02b09033cd56dadc3f377a4f21e89d14a901d2be9c9d5" Feb 26 11:33:01 crc kubenswrapper[4699]: I0226 11:33:01.892727 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-43f0-account-create-update-vgmlz" Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.903872 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerStarted","Data":"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf"} Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.904295 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.904064 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="proxy-httpd" containerID="cri-o://122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf" gracePeriod=30 Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.904022 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-central-agent" containerID="cri-o://5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805" gracePeriod=30 Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.904080 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="sg-core" containerID="cri-o://640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921" gracePeriod=30 Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 11:33:02.904092 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-notification-agent" containerID="cri-o://49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1" gracePeriod=30 Feb 26 11:33:02 crc kubenswrapper[4699]: I0226 
11:33:02.929934 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.328763021 podStartE2EDuration="7.929907274s" podCreationTimestamp="2026-02-26 11:32:55 +0000 UTC" firstStartedPulling="2026-02-26 11:32:56.681566655 +0000 UTC m=+1322.492393099" lastFinishedPulling="2026-02-26 11:33:02.282710918 +0000 UTC m=+1328.093537352" observedRunningTime="2026-02-26 11:33:02.922571841 +0000 UTC m=+1328.733398275" watchObservedRunningTime="2026-02-26 11:33:02.929907274 +0000 UTC m=+1328.740733708" Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917456 4699 generic.go:334] "Generic (PLEG): container finished" podID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerID="122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf" exitCode=0 Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917490 4699 generic.go:334] "Generic (PLEG): container finished" podID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerID="640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921" exitCode=2 Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917499 4699 generic.go:334] "Generic (PLEG): container finished" podID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerID="49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1" exitCode=0 Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917523 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerDied","Data":"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf"} Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917558 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerDied","Data":"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921"} Feb 26 11:33:03 crc kubenswrapper[4699]: I0226 11:33:03.917574 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerDied","Data":"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1"} Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.612152 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689376 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689440 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689495 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689536 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689562 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89884\" (UniqueName: 
\"kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689621 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.689964 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd\") pod \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\" (UID: \"3af9cf0d-3dcb-4d56-9373-2ec6ea323564\") " Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.691082 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.692407 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.698989 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884" (OuterVolumeSpecName: "kube-api-access-89884") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "kube-api-access-89884". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.704535 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts" (OuterVolumeSpecName: "scripts") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.721372 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.767024 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.791953 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.791982 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.791993 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.792003 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.792013 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89884\" (UniqueName: \"kubernetes.io/projected/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-kube-api-access-89884\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.792023 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.795425 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data" (OuterVolumeSpecName: "config-data") pod "3af9cf0d-3dcb-4d56-9373-2ec6ea323564" (UID: "3af9cf0d-3dcb-4d56-9373-2ec6ea323564"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.893459 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af9cf0d-3dcb-4d56-9373-2ec6ea323564-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.941096 4699 generic.go:334] "Generic (PLEG): container finished" podID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerID="5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805" exitCode=0 Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.941170 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerDied","Data":"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805"} Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.941203 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3af9cf0d-3dcb-4d56-9373-2ec6ea323564","Type":"ContainerDied","Data":"56fc6b0ffcfa6ade2b63264a18f35f46ed39dca62b34ae50c392d8a43e061b9a"} Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.941221 4699 scope.go:117] "RemoveContainer" containerID="122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.941381 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.988080 4699 scope.go:117] "RemoveContainer" containerID="640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921" Feb 26 11:33:05 crc kubenswrapper[4699]: I0226 11:33:05.991569 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.013474 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.025898 4699 scope.go:117] "RemoveContainer" containerID="49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026376 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026838 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="proxy-httpd" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026858 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="proxy-httpd" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026866 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99c6b36-a5f6-4f0b-973f-dfa853d2c558" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026874 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99c6b36-a5f6-4f0b-973f-dfa853d2c558" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026895 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-central-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026902 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-central-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026913 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-notification-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026920 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-notification-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026935 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54b257-33a7-43bd-80c5-30915ae82341" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026942 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54b257-33a7-43bd-80c5-30915ae82341" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026969 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea40818-89fa-4b78-9833-82635861fee1" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026976 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea40818-89fa-4b78-9833-82635861fee1" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.026992 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5acc31-dbe4-4698-8346-9a0dbc05234b" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.026999 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5acc31-dbe4-4698-8346-9a0dbc05234b" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.027012 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a229eb-75a5-41b1-8342-53a3a1b433a0" containerName="mariadb-account-create-update" Feb 26 11:33:06 
crc kubenswrapper[4699]: I0226 11:33:06.027018 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a229eb-75a5-41b1-8342-53a3a1b433a0" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.027031 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86425865-434f-43e8-9592-e890078837a2" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027037 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="86425865-434f-43e8-9592-e890078837a2" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.027049 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="sg-core" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027056 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="sg-core" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027292 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54b257-33a7-43bd-80c5-30915ae82341" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027310 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-notification-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027322 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a229eb-75a5-41b1-8342-53a3a1b433a0" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027329 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="proxy-httpd" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027340 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99c6b36-a5f6-4f0b-973f-dfa853d2c558" 
containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027352 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="sg-core" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027361 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" containerName="ceilometer-central-agent" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027374 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea40818-89fa-4b78-9833-82635861fee1" containerName="mariadb-account-create-update" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027385 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="86425865-434f-43e8-9592-e890078837a2" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.027394 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5acc31-dbe4-4698-8346-9a0dbc05234b" containerName="mariadb-database-create" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.029347 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.038071 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.056250 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.056289 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.077828 4699 scope.go:117] "RemoveContainer" containerID="5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097246 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097300 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snlzj\" (UniqueName: \"kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097372 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097392 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097436 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.097488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.120475 4699 scope.go:117] "RemoveContainer" containerID="122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.120929 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf\": container with ID starting with 122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf not found: ID does not exist" containerID="122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf" Feb 26 11:33:06 crc 
kubenswrapper[4699]: I0226 11:33:06.121072 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf"} err="failed to get container status \"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf\": rpc error: code = NotFound desc = could not find container \"122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf\": container with ID starting with 122b3010bd28520fa64679c76c74e7f10dc4aef71ddb5d070e468d1dfb7dddcf not found: ID does not exist" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.121215 4699 scope.go:117] "RemoveContainer" containerID="640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.121601 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921\": container with ID starting with 640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921 not found: ID does not exist" containerID="640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.121694 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921"} err="failed to get container status \"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921\": rpc error: code = NotFound desc = could not find container \"640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921\": container with ID starting with 640ce3baeea3e4b12efb1f4f719021e9fcae968de1b1812fed4c585f37198921 not found: ID does not exist" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.121774 4699 scope.go:117] "RemoveContainer" containerID="49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1" Feb 26 
11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.122182 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1\": container with ID starting with 49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1 not found: ID does not exist" containerID="49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.122206 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1"} err="failed to get container status \"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1\": rpc error: code = NotFound desc = could not find container \"49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1\": container with ID starting with 49f60262b1f3a898a9823898820585f9f8010c300b1f1a4ba7911b0817efb1e1 not found: ID does not exist" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.122221 4699 scope.go:117] "RemoveContainer" containerID="5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805" Feb 26 11:33:06 crc kubenswrapper[4699]: E0226 11:33:06.122501 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805\": container with ID starting with 5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805 not found: ID does not exist" containerID="5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.122590 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805"} err="failed to get container status 
\"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805\": rpc error: code = NotFound desc = could not find container \"5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805\": container with ID starting with 5676ecd8fae29c4c55da186cdff47b53a9f38f95d3b5e21ebca1ea44099ac805 not found: ID does not exist" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199094 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199163 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snlzj\" (UniqueName: \"kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199224 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199260 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199280 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199316 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199346 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.199775 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.200394 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.202984 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.203296 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.203963 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.204319 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.220508 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snlzj\" (UniqueName: \"kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj\") pod \"ceilometer-0\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") " pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.272218 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af9cf0d-3dcb-4d56-9373-2ec6ea323564" path="/var/lib/kubelet/pods/3af9cf0d-3dcb-4d56-9373-2ec6ea323564/volumes" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.379584 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.531881 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vx5jv"] Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.533607 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.535274 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.535801 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.536134 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wwbkn" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.543335 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vx5jv"] Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.607800 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.607844 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmzl\" (UniqueName: \"kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 
11:33:06.607893 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.608203 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.711818 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.711975 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.712015 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmzl\" (UniqueName: \"kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc 
kubenswrapper[4699]: I0226 11:33:06.712077 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.718694 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.718906 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.718977 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.730909 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmzl\" (UniqueName: \"kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl\") pod \"nova-cell0-conductor-db-sync-vx5jv\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 
11:33:06.859429 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:06 crc kubenswrapper[4699]: I0226 11:33:06.937473 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:07 crc kubenswrapper[4699]: I0226 11:33:07.345083 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vx5jv"] Feb 26 11:33:07 crc kubenswrapper[4699]: W0226 11:33:07.352968 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef20f352_fa9c_4bc8_875d_d537f00f75d5.slice/crio-eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c WatchSource:0}: Error finding container eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c: Status 404 returned error can't find the container with id eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c Feb 26 11:33:07 crc kubenswrapper[4699]: I0226 11:33:07.991466 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerStarted","Data":"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"} Feb 26 11:33:07 crc kubenswrapper[4699]: I0226 11:33:07.992242 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerStarted","Data":"f5db747a1df1457a54dc26f8dc7732ec37162001229405d6e3c5d95928877c82"} Feb 26 11:33:07 crc kubenswrapper[4699]: I0226 11:33:07.992552 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" event={"ID":"ef20f352-fa9c-4bc8-875d-d537f00f75d5","Type":"ContainerStarted","Data":"eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c"} Feb 26 11:33:09 crc kubenswrapper[4699]: I0226 11:33:09.488248 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerStarted","Data":"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"} Feb 26 11:33:10 crc kubenswrapper[4699]: I0226 11:33:10.499040 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerStarted","Data":"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"} Feb 26 11:33:10 crc kubenswrapper[4699]: I0226 11:33:10.600010 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:33:10 crc kubenswrapper[4699]: I0226 11:33:10.600285 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-log" containerID="cri-o://6389553c6ab212c6bdd09de56f8a6c0bc3ab110475816ef41318a3d8e60aa198" gracePeriod=30 Feb 26 11:33:10 crc kubenswrapper[4699]: I0226 11:33:10.600423 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-httpd" containerID="cri-o://178382a3de582f32c10d899adfcff30626331736ab73b2469c8ca1b37fab0c4c" gracePeriod=30 Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.424064 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.424579 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-httpd" containerID="cri-o://9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2" gracePeriod=30 Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 
11:33:11.424839 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-log" containerID="cri-o://96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558" gracePeriod=30 Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.509481 4699 generic.go:334] "Generic (PLEG): container finished" podID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerID="6389553c6ab212c6bdd09de56f8a6c0bc3ab110475816ef41318a3d8e60aa198" exitCode=143 Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.509636 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerDied","Data":"6389553c6ab212c6bdd09de56f8a6c0bc3ab110475816ef41318a3d8e60aa198"} Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.584466 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:33:11 crc kubenswrapper[4699]: I0226 11:33:11.585017 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:33:12 crc kubenswrapper[4699]: I0226 11:33:12.512756 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:12 crc kubenswrapper[4699]: I0226 11:33:12.524574 4699 generic.go:334] "Generic (PLEG): container finished" podID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" 
containerID="96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558" exitCode=143 Feb 26 11:33:12 crc kubenswrapper[4699]: I0226 11:33:12.524629 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerDied","Data":"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558"} Feb 26 11:33:14 crc kubenswrapper[4699]: I0226 11:33:14.543662 4699 generic.go:334] "Generic (PLEG): container finished" podID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerID="178382a3de582f32c10d899adfcff30626331736ab73b2469c8ca1b37fab0c4c" exitCode=0 Feb 26 11:33:14 crc kubenswrapper[4699]: I0226 11:33:14.543733 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerDied","Data":"178382a3de582f32c10d899adfcff30626331736ab73b2469c8ca1b37fab0c4c"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.285186 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374649 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374738 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374817 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374933 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374962 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7ml\" (UniqueName: \"kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.374980 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.375019 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.375034 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data\") pod \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\" (UID: \"d42e724c-224e-4c68-b5b4-b72d72d4ded8\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.375186 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.375442 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.377560 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs" (OuterVolumeSpecName: "logs") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.385271 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.385306 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts" (OuterVolumeSpecName: "scripts") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.404251 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml" (OuterVolumeSpecName: "kube-api-access-6q7ml") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "kube-api-access-6q7ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.406354 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.443548 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.468933 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data" (OuterVolumeSpecName: "config-data") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476108 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476181 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476227 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2rv\" (UniqueName: \"kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476249 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476284 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476360 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476485 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.476535 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run\") pod \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\" (UID: \"03f1bc3b-c587-4c47-bbc2-3dca2240d30c\") " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.477956 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.477992 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478006 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q7ml\" (UniqueName: 
\"kubernetes.io/projected/d42e724c-224e-4c68-b5b4-b72d72d4ded8-kube-api-access-6q7ml\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478017 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478028 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42e724c-224e-4c68-b5b4-b72d72d4ded8-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478038 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478306 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs" (OuterVolumeSpecName: "logs") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.478696 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.486421 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d42e724c-224e-4c68-b5b4-b72d72d4ded8" (UID: "d42e724c-224e-4c68-b5b4-b72d72d4ded8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.488479 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.489262 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts" (OuterVolumeSpecName: "scripts") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.507471 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv" (OuterVolumeSpecName: "kube-api-access-7c2rv") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "kube-api-access-7c2rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.525909 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.560984 4699 generic.go:334] "Generic (PLEG): container finished" podID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerID="9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2" exitCode=0 Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.561041 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerDied","Data":"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.561067 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"03f1bc3b-c587-4c47-bbc2-3dca2240d30c","Type":"ContainerDied","Data":"1807142ca5ad23a6f297805a63bd9002973ffc620dfe243a1b7bd07573b9a98e"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.561084 4699 scope.go:117] "RemoveContainer" containerID="9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.561265 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.562965 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.563822 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d42e724c-224e-4c68-b5b4-b72d72d4ded8","Type":"ContainerDied","Data":"93414ab02ed9ca4e817beb6280ab1441d20975697c632df8c1a82aa6fe45a0b0"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.563898 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.578614 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-central-agent" containerID="cri-o://613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc" gracePeriod=30 Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.578718 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerStarted","Data":"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.578788 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.579210 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="proxy-httpd" containerID="cri-o://c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a" gracePeriod=30 Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.579291 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="sg-core" 
containerID="cri-o://2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db" gracePeriod=30 Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.579341 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-notification-agent" containerID="cri-o://69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e" gracePeriod=30 Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586769 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586796 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586809 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586818 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c2rv\" (UniqueName: \"kubernetes.io/projected/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-kube-api-access-7c2rv\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586845 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586854 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node 
\"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586863 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.586872 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42e724c-224e-4c68-b5b4-b72d72d4ded8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.593511 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.606522 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" event={"ID":"ef20f352-fa9c-4bc8-875d-d537f00f75d5","Type":"ContainerStarted","Data":"b4034fed15cab382c6c5fd47ff21f822b9c9aa9789392181d8ca9fe59c0d233d"} Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.614033 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.599954267 podStartE2EDuration="10.61401043s" podCreationTimestamp="2026-02-26 11:33:05 +0000 UTC" firstStartedPulling="2026-02-26 11:33:06.960653761 +0000 UTC m=+1332.771480215" lastFinishedPulling="2026-02-26 11:33:14.974709954 +0000 UTC m=+1340.785536378" observedRunningTime="2026-02-26 11:33:15.60642938 +0000 UTC m=+1341.417255814" watchObservedRunningTime="2026-02-26 11:33:15.61401043 +0000 UTC m=+1341.424836864" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.639347 4699 
scope.go:117] "RemoveContainer" containerID="96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.690416 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" podStartSLOduration=2.069838932 podStartE2EDuration="9.690393668s" podCreationTimestamp="2026-02-26 11:33:06 +0000 UTC" firstStartedPulling="2026-02-26 11:33:07.354754855 +0000 UTC m=+1333.165581289" lastFinishedPulling="2026-02-26 11:33:14.975309581 +0000 UTC m=+1340.786136025" observedRunningTime="2026-02-26 11:33:15.627984426 +0000 UTC m=+1341.438810860" watchObservedRunningTime="2026-02-26 11:33:15.690393668 +0000 UTC m=+1341.501220102" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.691602 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data" (OuterVolumeSpecName: "config-data") pod "03f1bc3b-c587-4c47-bbc2-3dca2240d30c" (UID: "03f1bc3b-c587-4c47-bbc2-3dca2240d30c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.693160 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.723768 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.724047 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.731171 4699 scope.go:117] "RemoveContainer" containerID="9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.738484 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.748404 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2\": container with ID starting with 9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2 not found: ID does not exist" containerID="9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.748448 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2"} err="failed to get container status \"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2\": rpc error: code = NotFound desc = could not find container \"9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2\": 
container with ID starting with 9c85812a27867e8983c7de4f13e637e2314c74c4cdd4c4523f56efe2ed2922e2 not found: ID does not exist" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.748479 4699 scope.go:117] "RemoveContainer" containerID="96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.749919 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.750070 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558\": container with ID starting with 96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558 not found: ID does not exist" containerID="96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750129 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558"} err="failed to get container status \"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558\": rpc error: code = NotFound desc = could not find container \"96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558\": container with ID starting with 96b8b1dbd04dece91d5638035ad1782637a57aa6ee4d0c482a41be17ca67c558 not found: ID does not exist" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750160 4699 scope.go:117] "RemoveContainer" containerID="178382a3de582f32c10d899adfcff30626331736ab73b2469c8ca1b37fab0c4c" Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.750406 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750428 4699 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.750468 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750475 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.750483 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750489 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: E0226 11:33:15.750500 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750507 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750655 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750673 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750681 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" containerName="glance-log" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.750689 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" containerName="glance-httpd" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.751744 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.757768 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.757836 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.770023 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.795498 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f1bc3b-c587-4c47-bbc2-3dca2240d30c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.795538 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.800067 4699 scope.go:117] "RemoveContainer" containerID="6389553c6ab212c6bdd09de56f8a6c0bc3ab110475816ef41318a3d8e60aa198" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.896826 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.896882 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-logs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.896990 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.897170 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.897212 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.897232 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4xhd\" (UniqueName: \"kubernetes.io/projected/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-kube-api-access-b4xhd\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.897283 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.897299 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.977640 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.987725 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999161 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999215 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999234 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4xhd\" (UniqueName: 
\"kubernetes.io/projected/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-kube-api-access-b4xhd\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999266 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999283 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999338 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-logs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999436 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:15 crc kubenswrapper[4699]: I0226 11:33:15.999922 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.003998 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.004306 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.004956 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.005831 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.011677 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.015383 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-logs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.015598 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.017092 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.019720 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.054851 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4xhd\" (UniqueName: \"kubernetes.io/projected/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-kube-api-access-b4xhd\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0" Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 
11:33:16.055872 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.056945 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58ea0a-4ad4-47cf-8976-a004ef7e56da-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.069979 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9c58ea0a-4ad4-47cf-8976-a004ef7e56da\") " pod="openstack/glance-default-external-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101299 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k462j\" (UniqueName: \"kubernetes.io/projected/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-kube-api-access-k462j\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101371 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101406 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101429 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101482 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101524 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.101540 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.202754 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k462j\" (UniqueName: \"kubernetes.io/projected/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-kube-api-access-k462j\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.202876 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.202937 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.202966 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203032 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203084 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203103 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203136 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203188 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203594 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.203893 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.207183 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.209686 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.210094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.212310 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.232522 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k462j\" (UniqueName: \"kubernetes.io/projected/796738f1-8a6c-4e91-bdfe-bee2f252b3fc-kube-api-access-k462j\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.240756 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"796738f1-8a6c-4e91-bdfe-bee2f252b3fc\") " pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.275904 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f1bc3b-c587-4c47-bbc2-3dca2240d30c" path="/var/lib/kubelet/pods/03f1bc3b-c587-4c47-bbc2-3dca2240d30c/volumes"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.276706 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42e724c-224e-4c68-b5b4-b72d72d4ded8" path="/var/lib/kubelet/pods/d42e724c-224e-4c68-b5b4-b72d72d4ded8/volumes"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.371435 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.445520 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.623811 4699 generic.go:334] "Generic (PLEG): container finished" podID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerID="c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a" exitCode=0
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.623851 4699 generic.go:334] "Generic (PLEG): container finished" podID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerID="2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db" exitCode=2
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.623864 4699 generic.go:334] "Generic (PLEG): container finished" podID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerID="69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e" exitCode=0
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.624796 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerDied","Data":"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"}
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.624818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerDied","Data":"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"}
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.624829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerDied","Data":"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"}
Feb 26 11:33:16 crc kubenswrapper[4699]: I0226 11:33:16.982270 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.072149 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.525355 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632183 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632229 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632286 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632306 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632330 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snlzj\" (UniqueName: \"kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632363 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.632398 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml\") pod \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\" (UID: \"208a51e1-6d1d-4dc4-be5e-fa414dd87c53\") "
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.633394 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.634828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.640397 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts" (OuterVolumeSpecName: "scripts") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.640858 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c58ea0a-4ad4-47cf-8976-a004ef7e56da","Type":"ContainerStarted","Data":"3529c7f0d736937a2fd1b50e08c54631e6f695cc73430fabf4e692c1f92856ce"}
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.641075 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj" (OuterVolumeSpecName: "kube-api-access-snlzj") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "kube-api-access-snlzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.646906 4699 generic.go:334] "Generic (PLEG): container finished" podID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerID="613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc" exitCode=0
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.646945 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerDied","Data":"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"}
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.647009 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"208a51e1-6d1d-4dc4-be5e-fa414dd87c53","Type":"ContainerDied","Data":"f5db747a1df1457a54dc26f8dc7732ec37162001229405d6e3c5d95928877c82"}
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.647011 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.647029 4699 scope.go:117] "RemoveContainer" containerID="c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.648763 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"796738f1-8a6c-4e91-bdfe-bee2f252b3fc","Type":"ContainerStarted","Data":"1432b03cd9815938adecfe960a1371ee5d84e13b1503b4e8a86ec0e75efcacdb"}
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.691471 4699 scope.go:117] "RemoveContainer" containerID="2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.693983 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.711723 4699 scope.go:117] "RemoveContainer" containerID="69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734154 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734191 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734203 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734217 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snlzj\" (UniqueName: \"kubernetes.io/projected/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-kube-api-access-snlzj\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734232 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.734653 4699 scope.go:117] "RemoveContainer" containerID="613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.750761 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.778604 4699 scope.go:117] "RemoveContainer" containerID="c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"
Feb 26 11:33:17 crc kubenswrapper[4699]: E0226 11:33:17.779777 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a\": container with ID starting with c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a not found: ID does not exist" containerID="c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.779817 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a"} err="failed to get container status \"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a\": rpc error: code = NotFound desc = could not find container \"c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a\": container with ID starting with c2d216c82faf1eaa8cebf746f867d73e9bdd46ef39c2375acff0d8012285a29a not found: ID does not exist"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.779845 4699 scope.go:117] "RemoveContainer" containerID="2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"
Feb 26 11:33:17 crc kubenswrapper[4699]: E0226 11:33:17.780544 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db\": container with ID starting with 2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db not found: ID does not exist" containerID="2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.780581 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db"} err="failed to get container status \"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db\": rpc error: code = NotFound desc = could not find container \"2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db\": container with ID starting with 2ba5f61bbedbf25c6d3fc6052b14a28f362683db8e3bec9d0ffc62f27a5a70db not found: ID does not exist"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.780606 4699 scope.go:117] "RemoveContainer" containerID="69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"
Feb 26 11:33:17 crc kubenswrapper[4699]: E0226 11:33:17.781267 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e\": container with ID starting with 69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e not found: ID does not exist" containerID="69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.781313 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e"} err="failed to get container status \"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e\": rpc error: code = NotFound desc = could not find container \"69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e\": container with ID starting with 69a68218e2ed5906b1f2da6c10e033f2dc29321a9daa1c590a95092e50555d5e not found: ID does not exist"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.781343 4699 scope.go:117] "RemoveContainer" containerID="613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"
Feb 26 11:33:17 crc kubenswrapper[4699]: E0226 11:33:17.781934 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc\": container with ID starting with 613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc not found: ID does not exist" containerID="613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.781969 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc"} err="failed to get container status \"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc\": rpc error: code = NotFound desc = could not find container \"613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc\": container with ID starting with 613ad9080c787405540b5bbf9c6327648fb6d0e715ed0b9269326172474014dc not found: ID does not exist"
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.788060 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data" (OuterVolumeSpecName: "config-data") pod "208a51e1-6d1d-4dc4-be5e-fa414dd87c53" (UID: "208a51e1-6d1d-4dc4-be5e-fa414dd87c53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.836656 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:17 crc kubenswrapper[4699]: I0226 11:33:17.836719 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208a51e1-6d1d-4dc4-be5e-fa414dd87c53-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:17.996253 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.003499 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.020818 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:18 crc kubenswrapper[4699]: E0226 11:33:18.021500 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-central-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.021574 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-central-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: E0226 11:33:18.021684 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="sg-core"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.021934 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="sg-core"
Feb 26 11:33:18 crc kubenswrapper[4699]: E0226 11:33:18.031576 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="proxy-httpd"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.031870 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="proxy-httpd"
Feb 26 11:33:18 crc kubenswrapper[4699]: E0226 11:33:18.031962 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-notification-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.032024 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-notification-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.044300 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-notification-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.044533 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="proxy-httpd"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.044637 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="sg-core"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.044697 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" containerName="ceilometer-central-agent"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.046601 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.047013 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.049839 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.050078 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142039 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142084 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142108 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142371 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdvm\" (UniqueName: \"kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142453 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142577 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.142666 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.244801 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdvm\" (UniqueName: \"kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.244872 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.244923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.244958 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.245008 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.245027 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.245048 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.246581 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.246930 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.250085 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.254601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.260799 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.263005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.266993 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdvm\" (UniqueName: \"kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm\") pod \"ceilometer-0\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") " pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.282673 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208a51e1-6d1d-4dc4-be5e-fa414dd87c53" path="/var/lib/kubelet/pods/208a51e1-6d1d-4dc4-be5e-fa414dd87c53/volumes"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.370085 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.665323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"796738f1-8a6c-4e91-bdfe-bee2f252b3fc","Type":"ContainerStarted","Data":"2b8fee4ffc6d987f733fcb660517e174087b7a69049cd4a1545a4a414dc25609"}
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.665623 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"796738f1-8a6c-4e91-bdfe-bee2f252b3fc","Type":"ContainerStarted","Data":"8d8ab9d1111be5f66f0565bbfdfa83bf5512fcfeaabe44e4d0202b0f795ac56d"}
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.670381 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c58ea0a-4ad4-47cf-8976-a004ef7e56da","Type":"ContainerStarted","Data":"eda23f2ca8003c71c7bdfb45ca5b281325183f5eabcff56210b0b35d70f7be79"}
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.670409 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c58ea0a-4ad4-47cf-8976-a004ef7e56da","Type":"ContainerStarted","Data":"2d453a4544f70f5f76c564c5d804434402017e95a912a871437a8d52c894ee6e"}
Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.718960 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.71894036 podStartE2EDuration="3.71894036s" podCreationTimestamp="2026-02-26 11:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01
00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:18.708504527 +0000 UTC m=+1344.519330971" watchObservedRunningTime="2026-02-26 11:33:18.71894036 +0000 UTC m=+1344.529766794" Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.738880 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7388649689999998 podStartE2EDuration="3.738864969s" podCreationTimestamp="2026-02-26 11:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:18.734455271 +0000 UTC m=+1344.545281715" watchObservedRunningTime="2026-02-26 11:33:18.738864969 +0000 UTC m=+1344.549691403" Feb 26 11:33:18 crc kubenswrapper[4699]: I0226 11:33:18.757223 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:19 crc kubenswrapper[4699]: I0226 11:33:19.683545 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerStarted","Data":"0e70588ca2c32d2c8bf18e61605cae154752eb6909030e1d1477c1cf1b1f9f0c"} Feb 26 11:33:20 crc kubenswrapper[4699]: I0226 11:33:20.694512 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerStarted","Data":"04dcf7e8e201497d1cf45ed5c29abe7b3a178bdefc6cc1cf11f3cbae4131ffe7"} Feb 26 11:33:22 crc kubenswrapper[4699]: I0226 11:33:22.723255 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerStarted","Data":"0a5d89be89958727d068c6f547173e1db9e09eeaa55949f9e4b10646a2418098"} Feb 26 11:33:22 crc kubenswrapper[4699]: I0226 11:33:22.723833 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerStarted","Data":"1e8f4ef353b62f20e3fa0c0b216ab5527d39228a1eccacac6ba930465493a7ed"} Feb 26 11:33:24 crc kubenswrapper[4699]: I0226 11:33:24.741450 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerStarted","Data":"69f610b3a3266f67627b13f99326d06ba576d343b85ee61005b092a805c73f19"} Feb 26 11:33:24 crc kubenswrapper[4699]: I0226 11:33:24.741923 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 11:33:24 crc kubenswrapper[4699]: I0226 11:33:24.762729 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.308242364 podStartE2EDuration="7.762712938s" podCreationTimestamp="2026-02-26 11:33:17 +0000 UTC" firstStartedPulling="2026-02-26 11:33:18.74889359 +0000 UTC m=+1344.559720014" lastFinishedPulling="2026-02-26 11:33:24.203364154 +0000 UTC m=+1350.014190588" observedRunningTime="2026-02-26 11:33:24.756995692 +0000 UTC m=+1350.567822136" watchObservedRunningTime="2026-02-26 11:33:24.762712938 +0000 UTC m=+1350.573539372" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.372510 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.372816 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.414530 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.426599 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: 
I0226 11:33:26.446960 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.447010 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.481613 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.496344 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.760336 4699 generic.go:334] "Generic (PLEG): container finished" podID="ef20f352-fa9c-4bc8-875d-d537f00f75d5" containerID="b4034fed15cab382c6c5fd47ff21f822b9c9aa9789392181d8ca9fe59c0d233d" exitCode=0 Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.760405 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" event={"ID":"ef20f352-fa9c-4bc8-875d-d537f00f75d5","Type":"ContainerDied","Data":"b4034fed15cab382c6c5fd47ff21f822b9c9aa9789392181d8ca9fe59c0d233d"} Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.761168 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.761203 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.761216 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 11:33:26 crc kubenswrapper[4699]: I0226 11:33:26.761231 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 11:33:28 
crc kubenswrapper[4699]: I0226 11:33:28.153136 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.224279 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmzl\" (UniqueName: \"kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl\") pod \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.225364 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data\") pod \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.225462 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle\") pod \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.225736 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts\") pod \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\" (UID: \"ef20f352-fa9c-4bc8-875d-d537f00f75d5\") " Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.235831 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl" (OuterVolumeSpecName: "kube-api-access-7nmzl") pod "ef20f352-fa9c-4bc8-875d-d537f00f75d5" (UID: "ef20f352-fa9c-4bc8-875d-d537f00f75d5"). InnerVolumeSpecName "kube-api-access-7nmzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.237467 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts" (OuterVolumeSpecName: "scripts") pod "ef20f352-fa9c-4bc8-875d-d537f00f75d5" (UID: "ef20f352-fa9c-4bc8-875d-d537f00f75d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.261038 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data" (OuterVolumeSpecName: "config-data") pod "ef20f352-fa9c-4bc8-875d-d537f00f75d5" (UID: "ef20f352-fa9c-4bc8-875d-d537f00f75d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.265959 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef20f352-fa9c-4bc8-875d-d537f00f75d5" (UID: "ef20f352-fa9c-4bc8-875d-d537f00f75d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.327889 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.327922 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nmzl\" (UniqueName: \"kubernetes.io/projected/ef20f352-fa9c-4bc8-875d-d537f00f75d5-kube-api-access-7nmzl\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.327936 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.327945 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef20f352-fa9c-4bc8-875d-d537f00f75d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.706970 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.715295 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.784492 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.785048 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vx5jv" event={"ID":"ef20f352-fa9c-4bc8-875d-d537f00f75d5","Type":"ContainerDied","Data":"eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c"} Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.785074 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb24292f1cef0da184d4ae6d21a79311c33325b5ebc6fce8d6d4a49689b83c5c" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.813914 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.814045 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.841183 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.889469 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 11:33:28 crc kubenswrapper[4699]: E0226 11:33:28.890020 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef20f352-fa9c-4bc8-875d-d537f00f75d5" containerName="nova-cell0-conductor-db-sync" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.890050 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef20f352-fa9c-4bc8-875d-d537f00f75d5" containerName="nova-cell0-conductor-db-sync" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.890280 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef20f352-fa9c-4bc8-875d-d537f00f75d5" containerName="nova-cell0-conductor-db-sync" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.891066 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.896692 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.896937 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wwbkn" Feb 26 11:33:28 crc kubenswrapper[4699]: I0226 11:33:28.900441 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.042029 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4kv\" (UniqueName: \"kubernetes.io/projected/2ff15a2d-962f-421b-be00-e3bf6ef22612-kube-api-access-xg4kv\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.042127 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.042197 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.144411 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4kv\" (UniqueName: 
\"kubernetes.io/projected/2ff15a2d-962f-421b-be00-e3bf6ef22612-kube-api-access-xg4kv\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.144476 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.144563 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.158922 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.160928 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff15a2d-962f-421b-be00-e3bf6ef22612-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.178771 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4kv\" (UniqueName: \"kubernetes.io/projected/2ff15a2d-962f-421b-be00-e3bf6ef22612-kube-api-access-xg4kv\") pod \"nova-cell0-conductor-0\" (UID: 
\"2ff15a2d-962f-421b-be00-e3bf6ef22612\") " pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.230598 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.734048 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 11:33:29 crc kubenswrapper[4699]: I0226 11:33:29.804578 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2ff15a2d-962f-421b-be00-e3bf6ef22612","Type":"ContainerStarted","Data":"b17a43ba82df7a299d8ba1b054c19792fda9485d32c2d22c81f4c0723103e26b"} Feb 26 11:33:30 crc kubenswrapper[4699]: I0226 11:33:30.816548 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2ff15a2d-962f-421b-be00-e3bf6ef22612","Type":"ContainerStarted","Data":"abb700147106b7c9a2ad04b5cc3a70a9bca9d60fc1eb88d1c997133fc2921acb"} Feb 26 11:33:30 crc kubenswrapper[4699]: I0226 11:33:30.818326 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:30 crc kubenswrapper[4699]: I0226 11:33:30.847561 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8475409369999998 podStartE2EDuration="2.847540937s" podCreationTimestamp="2026-02-26 11:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:30.835352283 +0000 UTC m=+1356.646178707" watchObservedRunningTime="2026-02-26 11:33:30.847540937 +0000 UTC m=+1356.658367371" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.256505 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 11:33:34 crc kubenswrapper[4699]: 
I0226 11:33:34.797259 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mcdml"] Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.802520 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.810805 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.811250 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.834588 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mcdml"] Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.861557 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.861617 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsct\" (UniqueName: \"kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.861654 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") 
" pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.861687 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.914162 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.915958 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.920142 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.930638 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963789 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7zp\" (UniqueName: \"kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963839 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963874 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsct\" (UniqueName: \"kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963910 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963930 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.963966 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.964017 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.973970 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.976070 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:34 crc kubenswrapper[4699]: I0226 11:33:34.978776 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.006578 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsct\" (UniqueName: \"kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct\") pod \"nova-cell0-cell-mapping-mcdml\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.017318 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.025889 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.037645 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.042512 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.043982 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.046632 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.058804 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.065950 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.065986 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.066009 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmv6\" (UniqueName: \"kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " 
pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.066077 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7zp\" (UniqueName: \"kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.066122 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.066200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.076624 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.098292 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.114485 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.115143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.130618 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7zp\" (UniqueName: \"kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.131286 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.137552 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.138490 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.146656 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.183893 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrt7n\" (UniqueName: \"kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.184302 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " 
pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.184508 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.184661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.184778 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64fjz\" (UniqueName: \"kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.184910 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.185022 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.185153 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.185259 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmv6\" (UniqueName: \"kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.185356 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.185464 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.197895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.202483 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.208218 4699 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.210950 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.213060 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.225374 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmv6\" (UniqueName: \"kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6\") pod \"nova-scheduler-0\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") " pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.247422 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.267853 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.286941 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.287012 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.287041 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpwp\" (UniqueName: \"kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.287564 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.287598 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" 
Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.288008 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.288079 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrt7n\" (UniqueName: \"kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.288096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.288200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289238 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.288677 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289333 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289191 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289403 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289455 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.289480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64fjz\" (UniqueName: \"kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " 
pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.292462 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.292879 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.293478 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.294487 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.309321 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64fjz\" (UniqueName: \"kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz\") pod \"nova-metadata-0\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.312665 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrt7n\" (UniqueName: 
\"kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n\") pod \"nova-api-0\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394085 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394359 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394391 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpwp\" (UniqueName: \"kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394416 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394510 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config\") pod 
\"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.394589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.396286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.397289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.397619 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.406868 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " 
pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.409946 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.414860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpwp\" (UniqueName: \"kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp\") pod \"dnsmasq-dns-865f5d856f-5jmd5\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.583657 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.598252 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.614684 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.804194 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.824330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mcdml"] Feb 26 11:33:35 crc kubenswrapper[4699]: W0226 11:33:35.830658 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod003dad7c_8300_49a9_80d0_99dcad71fa84.slice/crio-1ffae0e8d9c34eeb99720f9aac223ffd7663c8e780138c2fbc15f472417c5fda WatchSource:0}: Error finding container 1ffae0e8d9c34eeb99720f9aac223ffd7663c8e780138c2fbc15f472417c5fda: Status 404 returned error can't find the container with id 1ffae0e8d9c34eeb99720f9aac223ffd7663c8e780138c2fbc15f472417c5fda Feb 26 11:33:35 crc kubenswrapper[4699]: W0226 11:33:35.887800 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf528c9c1_4318_4d46_9b02_43f955e04009.slice/crio-53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad WatchSource:0}: Error finding container 53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad: Status 404 returned error can't find the container with id 53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.896854 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"003dad7c-8300-49a9-80d0-99dcad71fa84","Type":"ContainerStarted","Data":"1ffae0e8d9c34eeb99720f9aac223ffd7663c8e780138c2fbc15f472417c5fda"} Feb 26 11:33:35 crc kubenswrapper[4699]: I0226 11:33:35.939566 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.102695 4699 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz84d"] Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.104245 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.106844 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.106938 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.112700 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz84d"] Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.160416 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.225492 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.225599 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wlj\" (UniqueName: \"kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.225697 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.225718 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.328262 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.328319 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.328375 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.328460 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5wlj\" 
(UniqueName: \"kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.340618 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.347793 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:33:36 crc kubenswrapper[4699]: W0226 11:33:36.348242 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4545ea_b0c7_4fd6_9636_a826457d4e3a.slice/crio-45f061eac629f86d341067b2a329a3980fab0dd355c7b346b4b639f44c038307 WatchSource:0}: Error finding container 45f061eac629f86d341067b2a329a3980fab0dd355c7b346b4b639f44c038307: Status 404 returned error can't find the container with id 45f061eac629f86d341067b2a329a3980fab0dd355c7b346b4b639f44c038307 Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.350089 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.350445 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts\") pod 
\"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.356033 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5wlj\" (UniqueName: \"kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj\") pod \"nova-cell1-conductor-db-sync-dz84d\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") " pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.364986 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.610849 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz84d" Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.951292 4699 generic.go:334] "Generic (PLEG): container finished" podID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerID="9bee82430e4d84a9497e3680da14bb7fec649ba1905937229370f30514994319" exitCode=0 Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.951622 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" event={"ID":"90cd25a3-8ac5-49d2-b3a1-79c773a0b394","Type":"ContainerDied","Data":"9bee82430e4d84a9497e3680da14bb7fec649ba1905937229370f30514994319"} Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.951663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" event={"ID":"90cd25a3-8ac5-49d2-b3a1-79c773a0b394","Type":"ContainerStarted","Data":"e1b123469f14c639c8594d09af4903ba398bf0ca95a50aeadc71f0627b95230b"} Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.957242 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6d5f37fe-0099-471b-9192-5f52735977b1","Type":"ContainerStarted","Data":"891a5f2f8e4df93b7d5e317f0bde0ca23ec3dcb73c4f3ad638024da213a38a6c"} Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.966959 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mcdml" event={"ID":"f528c9c1-4318-4d46-9b02-43f955e04009","Type":"ContainerStarted","Data":"2cee4e67f7ca1be08a16734a80281eca2dc16bb5d20a6d285f430706b65292fe"} Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.967019 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mcdml" event={"ID":"f528c9c1-4318-4d46-9b02-43f955e04009","Type":"ContainerStarted","Data":"53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad"} Feb 26 11:33:36 crc kubenswrapper[4699]: I0226 11:33:36.996138 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerStarted","Data":"0001ca71eaea85ec4a7157192b885fb03750c2a30c308dc7404b715439e990b4"} Feb 26 11:33:37 crc kubenswrapper[4699]: I0226 11:33:37.010422 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mcdml" podStartSLOduration=3.010398302 podStartE2EDuration="3.010398302s" podCreationTimestamp="2026-02-26 11:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:36.993061839 +0000 UTC m=+1362.803888273" watchObservedRunningTime="2026-02-26 11:33:37.010398302 +0000 UTC m=+1362.821224746" Feb 26 11:33:37 crc kubenswrapper[4699]: I0226 11:33:37.017105 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerStarted","Data":"45f061eac629f86d341067b2a329a3980fab0dd355c7b346b4b639f44c038307"} Feb 26 11:33:37 crc kubenswrapper[4699]: 
I0226 11:33:37.146802 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz84d"] Feb 26 11:33:37 crc kubenswrapper[4699]: W0226 11:33:37.664327 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5fb37dd_bd18_4ada_97c4_3ff3e3555d8a.slice/crio-eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469 WatchSource:0}: Error finding container eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469: Status 404 returned error can't find the container with id eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469 Feb 26 11:33:38 crc kubenswrapper[4699]: I0226 11:33:38.027630 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz84d" event={"ID":"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a","Type":"ContainerStarted","Data":"eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469"} Feb 26 11:33:38 crc kubenswrapper[4699]: I0226 11:33:38.515668 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:38 crc kubenswrapper[4699]: I0226 11:33:38.558012 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.062561 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerStarted","Data":"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5"} Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.065384 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" event={"ID":"90cd25a3-8ac5-49d2-b3a1-79c773a0b394","Type":"ContainerStarted","Data":"50c7ddb03e58cd9791ab6f41d1755213bce0ea0826aec0f5b6934548dfaf9782"} Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.065567 4699 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.071100 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz84d" event={"ID":"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a","Type":"ContainerStarted","Data":"6a0914a3db1c0b6e1b3a5a9cf2e1d8ac0e44a6dc0eb35fc159954e4b3f365a3d"} Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.076678 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6d5f37fe-0099-471b-9192-5f52735977b1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7" gracePeriod=30 Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.076771 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d5f37fe-0099-471b-9192-5f52735977b1","Type":"ContainerStarted","Data":"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"} Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.094192 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" podStartSLOduration=5.094176468 podStartE2EDuration="5.094176468s" podCreationTimestamp="2026-02-26 11:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:40.084190418 +0000 UTC m=+1365.895016872" watchObservedRunningTime="2026-02-26 11:33:40.094176468 +0000 UTC m=+1365.905002902" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.100008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"003dad7c-8300-49a9-80d0-99dcad71fa84","Type":"ContainerStarted","Data":"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"} Feb 26 
11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.103228 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerStarted","Data":"1c92a80645afa4e46d1addaafa746637c845088f3cba1406f81a67f4dfe1af22"} Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.105893 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dz84d" podStartSLOduration=4.105874878 podStartE2EDuration="4.105874878s" podCreationTimestamp="2026-02-26 11:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:40.096905237 +0000 UTC m=+1365.907731671" watchObservedRunningTime="2026-02-26 11:33:40.105874878 +0000 UTC m=+1365.916701312" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.125663 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.77719021 podStartE2EDuration="6.125646262s" podCreationTimestamp="2026-02-26 11:33:34 +0000 UTC" firstStartedPulling="2026-02-26 11:33:36.018213178 +0000 UTC m=+1361.829039602" lastFinishedPulling="2026-02-26 11:33:39.36666922 +0000 UTC m=+1365.177495654" observedRunningTime="2026-02-26 11:33:40.113337474 +0000 UTC m=+1365.924163908" watchObservedRunningTime="2026-02-26 11:33:40.125646262 +0000 UTC m=+1365.936472706" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.138014 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.623248039 podStartE2EDuration="6.13799201s" podCreationTimestamp="2026-02-26 11:33:34 +0000 UTC" firstStartedPulling="2026-02-26 11:33:35.852447144 +0000 UTC m=+1361.663273578" lastFinishedPulling="2026-02-26 11:33:39.367191115 +0000 UTC m=+1365.178017549" observedRunningTime="2026-02-26 11:33:40.130223085 +0000 UTC 
m=+1365.941049519" watchObservedRunningTime="2026-02-26 11:33:40.13799201 +0000 UTC m=+1365.948818444" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.250338 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 11:33:40 crc kubenswrapper[4699]: I0226 11:33:40.271748 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.169410 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerStarted","Data":"9d7c08d80dbb246a87244e1998a0a8a3673755ff4e13d9dbf8e24611acf9af5c"} Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.181879 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerStarted","Data":"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd"} Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.182257 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-metadata" containerID="cri-o://179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" gracePeriod=30 Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.182247 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-log" containerID="cri-o://3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" gracePeriod=30 Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.213252 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.019004042 podStartE2EDuration="6.213227946s" 
podCreationTimestamp="2026-02-26 11:33:35 +0000 UTC" firstStartedPulling="2026-02-26 11:33:36.173979571 +0000 UTC m=+1361.984806005" lastFinishedPulling="2026-02-26 11:33:39.368203465 +0000 UTC m=+1365.179029909" observedRunningTime="2026-02-26 11:33:41.201910618 +0000 UTC m=+1367.012737062" watchObservedRunningTime="2026-02-26 11:33:41.213227946 +0000 UTC m=+1367.024054400" Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.242274 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.221559766 podStartE2EDuration="7.242249509s" podCreationTimestamp="2026-02-26 11:33:34 +0000 UTC" firstStartedPulling="2026-02-26 11:33:36.350462247 +0000 UTC m=+1362.161288681" lastFinishedPulling="2026-02-26 11:33:39.37115199 +0000 UTC m=+1365.181978424" observedRunningTime="2026-02-26 11:33:41.234773532 +0000 UTC m=+1367.045599966" watchObservedRunningTime="2026-02-26 11:33:41.242249509 +0000 UTC m=+1367.053075943" Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.585580 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:33:41 crc kubenswrapper[4699]: I0226 11:33:41.585652 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.178231 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193584 4699 generic.go:334] "Generic (PLEG): container finished" podID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerID="179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" exitCode=0 Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193636 4699 generic.go:334] "Generic (PLEG): container finished" podID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerID="3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" exitCode=143 Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193667 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerDied","Data":"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd"} Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerDied","Data":"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5"} Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193819 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd4545ea-b0c7-4fd6-9636-a826457d4e3a","Type":"ContainerDied","Data":"45f061eac629f86d341067b2a329a3980fab0dd355c7b346b4b639f44c038307"} Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.193852 4699 scope.go:117] "RemoveContainer" containerID="179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.228958 4699 scope.go:117] "RemoveContainer" 
containerID="3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.250943 4699 scope.go:117] "RemoveContainer" containerID="179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" Feb 26 11:33:42 crc kubenswrapper[4699]: E0226 11:33:42.251520 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd\": container with ID starting with 179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd not found: ID does not exist" containerID="179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.251555 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd"} err="failed to get container status \"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd\": rpc error: code = NotFound desc = could not find container \"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd\": container with ID starting with 179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd not found: ID does not exist" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.251577 4699 scope.go:117] "RemoveContainer" containerID="3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" Feb 26 11:33:42 crc kubenswrapper[4699]: E0226 11:33:42.251824 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5\": container with ID starting with 3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5 not found: ID does not exist" containerID="3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" Feb 26 11:33:42 crc 
kubenswrapper[4699]: I0226 11:33:42.251861 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5"} err="failed to get container status \"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5\": rpc error: code = NotFound desc = could not find container \"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5\": container with ID starting with 3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5 not found: ID does not exist" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.251877 4699 scope.go:117] "RemoveContainer" containerID="179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.252255 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd"} err="failed to get container status \"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd\": rpc error: code = NotFound desc = could not find container \"179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd\": container with ID starting with 179c94fbb230260f4209c907fc41a84062208454b917138ea5aac422ebf727bd not found: ID does not exist" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.252306 4699 scope.go:117] "RemoveContainer" containerID="3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.252743 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5"} err="failed to get container status \"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5\": rpc error: code = NotFound desc = could not find container \"3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5\": container 
with ID starting with 3da1172a5fa169e6833849831b27d81d421a9de5b2685cc3e9f6f1b36d5c03a5 not found: ID does not exist" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.366229 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data\") pod \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.366349 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle\") pod \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.366376 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64fjz\" (UniqueName: \"kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz\") pod \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.367388 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs\") pod \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\" (UID: \"fd4545ea-b0c7-4fd6-9636-a826457d4e3a\") " Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.367769 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs" (OuterVolumeSpecName: "logs") pod "fd4545ea-b0c7-4fd6-9636-a826457d4e3a" (UID: "fd4545ea-b0c7-4fd6-9636-a826457d4e3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.367989 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.373197 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz" (OuterVolumeSpecName: "kube-api-access-64fjz") pod "fd4545ea-b0c7-4fd6-9636-a826457d4e3a" (UID: "fd4545ea-b0c7-4fd6-9636-a826457d4e3a"). InnerVolumeSpecName "kube-api-access-64fjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.402298 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd4545ea-b0c7-4fd6-9636-a826457d4e3a" (UID: "fd4545ea-b0c7-4fd6-9636-a826457d4e3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.412997 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data" (OuterVolumeSpecName: "config-data") pod "fd4545ea-b0c7-4fd6-9636-a826457d4e3a" (UID: "fd4545ea-b0c7-4fd6-9636-a826457d4e3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.470074 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.470130 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.470147 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64fjz\" (UniqueName: \"kubernetes.io/projected/fd4545ea-b0c7-4fd6-9636-a826457d4e3a-kube-api-access-64fjz\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.527751 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.549604 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.561834 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:42 crc kubenswrapper[4699]: E0226 11:33:42.562629 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-log" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.562652 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-log" Feb 26 11:33:42 crc kubenswrapper[4699]: E0226 11:33:42.562680 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-metadata" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.562688 4699 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-metadata" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.562864 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-log" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.562881 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" containerName="nova-metadata-metadata" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.563928 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.567936 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.568056 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.572424 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.572705 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.572753 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.572785 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxwjp\" (UniqueName: \"kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.572916 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.601245 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.673639 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.673716 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.673737 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.673754 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxwjp\" (UniqueName: \"kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.673794 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.674311 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.678008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.678413 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " 
pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.678473 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.703810 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxwjp\" (UniqueName: \"kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp\") pod \"nova-metadata-0\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " pod="openstack/nova-metadata-0" Feb 26 11:33:42 crc kubenswrapper[4699]: I0226 11:33:42.894713 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:43 crc kubenswrapper[4699]: I0226 11:33:43.366073 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:43 crc kubenswrapper[4699]: W0226 11:33:43.370464 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0740e570_d27e_4d97_b511_315a9ad45022.slice/crio-599226ea072da23c2b7a52b3304a9265a01ddf31d06132414e61defb38999e48 WatchSource:0}: Error finding container 599226ea072da23c2b7a52b3304a9265a01ddf31d06132414e61defb38999e48: Status 404 returned error can't find the container with id 599226ea072da23c2b7a52b3304a9265a01ddf31d06132414e61defb38999e48 Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.215592 4699 generic.go:334] "Generic (PLEG): container finished" podID="f528c9c1-4318-4d46-9b02-43f955e04009" containerID="2cee4e67f7ca1be08a16734a80281eca2dc16bb5d20a6d285f430706b65292fe" exitCode=0 Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.215634 4699 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-cell-mapping-mcdml" event={"ID":"f528c9c1-4318-4d46-9b02-43f955e04009","Type":"ContainerDied","Data":"2cee4e67f7ca1be08a16734a80281eca2dc16bb5d20a6d285f430706b65292fe"} Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.218190 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerStarted","Data":"56bcb9d42e1b3abd08748801a926fbec8d7021ec641d0c2f8df7fcff3ae44555"} Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.218251 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerStarted","Data":"2064f2d54f436918410771cf684160b85d090a94e656dd353921b47f48e5a9bd"} Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.218265 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerStarted","Data":"599226ea072da23c2b7a52b3304a9265a01ddf31d06132414e61defb38999e48"} Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.258109 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.258079541 podStartE2EDuration="2.258079541s" podCreationTimestamp="2026-02-26 11:33:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:44.248216244 +0000 UTC m=+1370.059042698" watchObservedRunningTime="2026-02-26 11:33:44.258079541 +0000 UTC m=+1370.068905995" Feb 26 11:33:44 crc kubenswrapper[4699]: I0226 11:33:44.271705 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4545ea-b0c7-4fd6-9636-a826457d4e3a" path="/var/lib/kubelet/pods/fd4545ea-b0c7-4fd6-9636-a826457d4e3a/volumes" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.269161 4699 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.307454 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.603363 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.603404 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.616810 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.629692 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.643940 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle\") pod \"f528c9c1-4318-4d46-9b02-43f955e04009\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.644166 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts\") pod \"f528c9c1-4318-4d46-9b02-43f955e04009\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.644281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data\") pod \"f528c9c1-4318-4d46-9b02-43f955e04009\" (UID: 
\"f528c9c1-4318-4d46-9b02-43f955e04009\") " Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.644381 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lsct\" (UniqueName: \"kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct\") pod \"f528c9c1-4318-4d46-9b02-43f955e04009\" (UID: \"f528c9c1-4318-4d46-9b02-43f955e04009\") " Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.654972 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts" (OuterVolumeSpecName: "scripts") pod "f528c9c1-4318-4d46-9b02-43f955e04009" (UID: "f528c9c1-4318-4d46-9b02-43f955e04009"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.671699 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct" (OuterVolumeSpecName: "kube-api-access-4lsct") pod "f528c9c1-4318-4d46-9b02-43f955e04009" (UID: "f528c9c1-4318-4d46-9b02-43f955e04009"). InnerVolumeSpecName "kube-api-access-4lsct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.711727 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data" (OuterVolumeSpecName: "config-data") pod "f528c9c1-4318-4d46-9b02-43f955e04009" (UID: "f528c9c1-4318-4d46-9b02-43f955e04009"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.715379 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.716051 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="dnsmasq-dns" containerID="cri-o://b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21" gracePeriod=10 Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.729533 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f528c9c1-4318-4d46-9b02-43f955e04009" (UID: "f528c9c1-4318-4d46-9b02-43f955e04009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.750836 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.750878 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lsct\" (UniqueName: \"kubernetes.io/projected/f528c9c1-4318-4d46-9b02-43f955e04009-kube-api-access-4lsct\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.750893 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:45 crc kubenswrapper[4699]: I0226 11:33:45.750906 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f528c9c1-4318-4d46-9b02-43f955e04009-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.234535 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.237364 4699 generic.go:334] "Generic (PLEG): container finished" podID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerID="b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21" exitCode=0 Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.237412 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" event={"ID":"9fa27ea0-52eb-406f-8256-68b4a471e452","Type":"ContainerDied","Data":"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21"} Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.237438 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.237466 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-vlzrl" event={"ID":"9fa27ea0-52eb-406f-8256-68b4a471e452","Type":"ContainerDied","Data":"096662e32232c28cf3046778c91211f7c3482d79260670ba5c8b5347692e739f"} Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.237492 4699 scope.go:117] "RemoveContainer" containerID="b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.240942 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mcdml" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.246496 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mcdml" event={"ID":"f528c9c1-4318-4d46-9b02-43f955e04009","Type":"ContainerDied","Data":"53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad"} Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.246569 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53909e629fc2641cca7ffd773dbe454e7b1a7fac09f9589c009aa88c45e195ad" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.258986 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.259222 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.259309 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.259341 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: 
\"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.259405 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.259449 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x97ms\" (UniqueName: \"kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms\") pod \"9fa27ea0-52eb-406f-8256-68b4a471e452\" (UID: \"9fa27ea0-52eb-406f-8256-68b4a471e452\") " Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.271824 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms" (OuterVolumeSpecName: "kube-api-access-x97ms") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "kube-api-access-x97ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.274300 4699 scope.go:117] "RemoveContainer" containerID="c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.316554 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.327489 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.337709 4699 scope.go:117] "RemoveContainer" containerID="b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21" Feb 26 11:33:46 crc kubenswrapper[4699]: E0226 11:33:46.338223 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21\": container with ID starting with b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21 not found: ID does not exist" containerID="b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.338264 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21"} err="failed to get container status \"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21\": rpc error: code = NotFound desc = could not find container \"b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21\": container with ID starting with b6434f4c6dfd6f80388c5828582829cf6f0feba6a3ec19ba253ae96d727b8a21 not found: ID does not exist" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.338289 4699 scope.go:117] "RemoveContainer" containerID="c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888" Feb 26 11:33:46 crc kubenswrapper[4699]: E0226 11:33:46.339406 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888\": container with ID starting with c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888 not found: ID does not exist" containerID="c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.339442 
4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888"} err="failed to get container status \"c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888\": rpc error: code = NotFound desc = could not find container \"c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888\": container with ID starting with c85028e72c3a0e7a111082381d8b68740bba7121da4a177d5d4e87fed9daf888 not found: ID does not exist" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.344069 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.359651 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.362071 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.362106 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.362131 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.362140 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x97ms\" (UniqueName: \"kubernetes.io/projected/9fa27ea0-52eb-406f-8256-68b4a471e452-kube-api-access-x97ms\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.382679 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config" (OuterVolumeSpecName: "config") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.404082 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fa27ea0-52eb-406f-8256-68b4a471e452" (UID: "9fa27ea0-52eb-406f-8256-68b4a471e452"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.463928 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.463960 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa27ea0-52eb-406f-8256-68b4a471e452-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.519794 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.520658 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-api" containerID="cri-o://9d7c08d80dbb246a87244e1998a0a8a3673755ff4e13d9dbf8e24611acf9af5c" gracePeriod=30 Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.520831 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-log" containerID="cri-o://1c92a80645afa4e46d1addaafa746637c845088f3cba1406f81a67f4dfe1af22" gracePeriod=30 Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.529715 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": EOF" Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.529921 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": EOF" Feb 26 
11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.573081 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.577626 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-metadata" containerID="cri-o://56bcb9d42e1b3abd08748801a926fbec8d7021ec641d0c2f8df7fcff3ae44555" gracePeriod=30 Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.577596 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-log" containerID="cri-o://2064f2d54f436918410771cf684160b85d090a94e656dd353921b47f48e5a9bd" gracePeriod=30 Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.653052 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.675218 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-vlzrl"] Feb 26 11:33:46 crc kubenswrapper[4699]: I0226 11:33:46.834450 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.286409 4699 generic.go:334] "Generic (PLEG): container finished" podID="0740e570-d27e-4d97-b511-315a9ad45022" containerID="56bcb9d42e1b3abd08748801a926fbec8d7021ec641d0c2f8df7fcff3ae44555" exitCode=0 Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.286440 4699 generic.go:334] "Generic (PLEG): container finished" podID="0740e570-d27e-4d97-b511-315a9ad45022" containerID="2064f2d54f436918410771cf684160b85d090a94e656dd353921b47f48e5a9bd" exitCode=143 Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.286458 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerDied","Data":"56bcb9d42e1b3abd08748801a926fbec8d7021ec641d0c2f8df7fcff3ae44555"} Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.286511 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerDied","Data":"2064f2d54f436918410771cf684160b85d090a94e656dd353921b47f48e5a9bd"} Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.291949 4699 generic.go:334] "Generic (PLEG): container finished" podID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerID="1c92a80645afa4e46d1addaafa746637c845088f3cba1406f81a67f4dfe1af22" exitCode=143 Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.292008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerDied","Data":"1c92a80645afa4e46d1addaafa746637c845088f3cba1406f81a67f4dfe1af22"} Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.515197 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.688410 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle\") pod \"0740e570-d27e-4d97-b511-315a9ad45022\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.688560 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs\") pod \"0740e570-d27e-4d97-b511-315a9ad45022\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.688597 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data\") pod \"0740e570-d27e-4d97-b511-315a9ad45022\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.688652 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs\") pod \"0740e570-d27e-4d97-b511-315a9ad45022\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.688810 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxwjp\" (UniqueName: \"kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp\") pod \"0740e570-d27e-4d97-b511-315a9ad45022\" (UID: \"0740e570-d27e-4d97-b511-315a9ad45022\") " Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.690948 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs" (OuterVolumeSpecName: "logs") pod "0740e570-d27e-4d97-b511-315a9ad45022" (UID: "0740e570-d27e-4d97-b511-315a9ad45022"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.698581 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp" (OuterVolumeSpecName: "kube-api-access-wxwjp") pod "0740e570-d27e-4d97-b511-315a9ad45022" (UID: "0740e570-d27e-4d97-b511-315a9ad45022"). InnerVolumeSpecName "kube-api-access-wxwjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.714323 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0740e570-d27e-4d97-b511-315a9ad45022" (UID: "0740e570-d27e-4d97-b511-315a9ad45022"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.731965 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data" (OuterVolumeSpecName: "config-data") pod "0740e570-d27e-4d97-b511-315a9ad45022" (UID: "0740e570-d27e-4d97-b511-315a9ad45022"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.758914 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0740e570-d27e-4d97-b511-315a9ad45022" (UID: "0740e570-d27e-4d97-b511-315a9ad45022"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.791752 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.791803 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0740e570-d27e-4d97-b511-315a9ad45022-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.791819 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxwjp\" (UniqueName: \"kubernetes.io/projected/0740e570-d27e-4d97-b511-315a9ad45022-kube-api-access-wxwjp\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.791831 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:47 crc kubenswrapper[4699]: I0226 11:33:47.791841 4699 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0740e570-d27e-4d97-b511-315a9ad45022-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.285503 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" path="/var/lib/kubelet/pods/9fa27ea0-52eb-406f-8256-68b4a471e452/volumes" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.320377 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerName="nova-scheduler-scheduler" 
containerID="cri-o://24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6" gracePeriod=30 Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.321173 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.321182 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0740e570-d27e-4d97-b511-315a9ad45022","Type":"ContainerDied","Data":"599226ea072da23c2b7a52b3304a9265a01ddf31d06132414e61defb38999e48"} Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.321502 4699 scope.go:117] "RemoveContainer" containerID="56bcb9d42e1b3abd08748801a926fbec8d7021ec641d0c2f8df7fcff3ae44555" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.353267 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.362182 4699 scope.go:117] "RemoveContainer" containerID="2064f2d54f436918410771cf684160b85d090a94e656dd353921b47f48e5a9bd" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.373456 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.388565 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406295 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 11:33:48.406815 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="dnsmasq-dns" Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406828 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="dnsmasq-dns" Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 
11:33:48.406840 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-log"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406847 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-log"
Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 11:33:48.406875 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="init"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406881 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="init"
Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 11:33:48.406897 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f528c9c1-4318-4d46-9b02-43f955e04009" containerName="nova-manage"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406905 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f528c9c1-4318-4d46-9b02-43f955e04009" containerName="nova-manage"
Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 11:33:48.406918 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-metadata"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.406924 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-metadata"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.407097 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f528c9c1-4318-4d46-9b02-43f955e04009" containerName="nova-manage"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.407128 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-log"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.407152 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa27ea0-52eb-406f-8256-68b4a471e452" containerName="dnsmasq-dns"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.407161 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0740e570-d27e-4d97-b511-315a9ad45022" containerName="nova-metadata-metadata"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.408293 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.415730 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.415933 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.420897 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 11:33:48 crc kubenswrapper[4699]: E0226 11:33:48.498408 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0740e570_d27e_4d97_b511_315a9ad45022.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.514297 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8g6\" (UniqueName: \"kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.514400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.514419 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.514468 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.514530 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.616894 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.617549 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8g6\" (UniqueName: \"kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.617752 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.617868 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.618016 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.618342 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.622418 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.623280 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.626552 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.638736 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8g6\" (UniqueName: \"kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6\") pod \"nova-metadata-0\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " pod="openstack/nova-metadata-0"
Feb 26 11:33:48 crc kubenswrapper[4699]: I0226 11:33:48.732955 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 11:33:49 crc kubenswrapper[4699]: I0226 11:33:49.293238 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 11:33:49 crc kubenswrapper[4699]: W0226 11:33:49.308645 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc847caf4_446a_4738_88a8_26d1628c91f7.slice/crio-3e0522c5f3203ed953a2188504efca913e34205ca0ad66ee7025a214978a16b0 WatchSource:0}: Error finding container 3e0522c5f3203ed953a2188504efca913e34205ca0ad66ee7025a214978a16b0: Status 404 returned error can't find the container with id 3e0522c5f3203ed953a2188504efca913e34205ca0ad66ee7025a214978a16b0
Feb 26 11:33:49 crc kubenswrapper[4699]: I0226 11:33:49.333685 4699 generic.go:334] "Generic (PLEG): container finished" podID="b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" containerID="6a0914a3db1c0b6e1b3a5a9cf2e1d8ac0e44a6dc0eb35fc159954e4b3f365a3d" exitCode=0
Feb 26 11:33:49 crc kubenswrapper[4699]: I0226 11:33:49.333818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz84d" event={"ID":"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a","Type":"ContainerDied","Data":"6a0914a3db1c0b6e1b3a5a9cf2e1d8ac0e44a6dc0eb35fc159954e4b3f365a3d"}
Feb 26 11:33:49 crc kubenswrapper[4699]: I0226 11:33:49.338442 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerStarted","Data":"3e0522c5f3203ed953a2188504efca913e34205ca0ad66ee7025a214978a16b0"}
Feb 26 11:33:50 crc kubenswrapper[4699]: E0226 11:33:50.271212 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.272612 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0740e570-d27e-4d97-b511-315a9ad45022" path="/var/lib/kubelet/pods/0740e570-d27e-4d97-b511-315a9ad45022/volumes"
Feb 26 11:33:50 crc kubenswrapper[4699]: E0226 11:33:50.273153 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 11:33:50 crc kubenswrapper[4699]: E0226 11:33:50.275436 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 11:33:50 crc kubenswrapper[4699]: E0226 11:33:50.275471 4699 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerName="nova-scheduler-scheduler"
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.357909 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerStarted","Data":"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81"}
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.357970 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerStarted","Data":"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa"}
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.382693 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.382671304 podStartE2EDuration="2.382671304s" podCreationTimestamp="2026-02-26 11:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:50.377832273 +0000 UTC m=+1376.188658727" watchObservedRunningTime="2026-02-26 11:33:50.382671304 +0000 UTC m=+1376.193497738"
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.682548 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz84d"
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.762346 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5wlj\" (UniqueName: \"kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj\") pod \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") "
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.762510 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts\") pod \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") "
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.762561 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data\") pod \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") "
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.762597 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle\") pod \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\" (UID: \"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a\") "
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.780141 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj" (OuterVolumeSpecName: "kube-api-access-v5wlj") pod "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" (UID: "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a"). InnerVolumeSpecName "kube-api-access-v5wlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.786571 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts" (OuterVolumeSpecName: "scripts") pod "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" (UID: "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.798098 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" (UID: "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.819941 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data" (OuterVolumeSpecName: "config-data") pod "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" (UID: "b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.866511 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.866561 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.866573 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:50 crc kubenswrapper[4699]: I0226 11:33:50.866583 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5wlj\" (UniqueName: \"kubernetes.io/projected/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a-kube-api-access-v5wlj\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.369055 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz84d" event={"ID":"b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a","Type":"ContainerDied","Data":"eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469"}
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.369102 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1129c3d9136c15c6047e7fc0342dfc057c4e5270886b2bb6ec9a038a974469"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.369104 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz84d"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.423477 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 11:33:51 crc kubenswrapper[4699]: E0226 11:33:51.423967 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" containerName="nova-cell1-conductor-db-sync"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.423991 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" containerName="nova-cell1-conductor-db-sync"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.424261 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" containerName="nova-cell1-conductor-db-sync"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.425028 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.427276 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.437089 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.476618 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.476683 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.477498 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbd65\" (UniqueName: \"kubernetes.io/projected/ff2b3846-c197-4cc6-a442-0f466d97d53d-kube-api-access-kbd65\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.579594 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbd65\" (UniqueName: \"kubernetes.io/projected/ff2b3846-c197-4cc6-a442-0f466d97d53d-kube-api-access-kbd65\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.579710 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.579736 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.586303 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.588947 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2b3846-c197-4cc6-a442-0f466d97d53d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.601052 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbd65\" (UniqueName: \"kubernetes.io/projected/ff2b3846-c197-4cc6-a442-0f466d97d53d-kube-api-access-kbd65\") pod \"nova-cell1-conductor-0\" (UID: \"ff2b3846-c197-4cc6-a442-0f466d97d53d\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.807303 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.922465 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.987424 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data\") pod \"003dad7c-8300-49a9-80d0-99dcad71fa84\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") "
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.988612 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znmv6\" (UniqueName: \"kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6\") pod \"003dad7c-8300-49a9-80d0-99dcad71fa84\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") "
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.988706 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle\") pod \"003dad7c-8300-49a9-80d0-99dcad71fa84\" (UID: \"003dad7c-8300-49a9-80d0-99dcad71fa84\") "
Feb 26 11:33:51 crc kubenswrapper[4699]: I0226 11:33:51.996076 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6" (OuterVolumeSpecName: "kube-api-access-znmv6") pod "003dad7c-8300-49a9-80d0-99dcad71fa84" (UID: "003dad7c-8300-49a9-80d0-99dcad71fa84"). InnerVolumeSpecName "kube-api-access-znmv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.024270 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "003dad7c-8300-49a9-80d0-99dcad71fa84" (UID: "003dad7c-8300-49a9-80d0-99dcad71fa84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.030301 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data" (OuterVolumeSpecName: "config-data") pod "003dad7c-8300-49a9-80d0-99dcad71fa84" (UID: "003dad7c-8300-49a9-80d0-99dcad71fa84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.092425 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.092742 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znmv6\" (UniqueName: \"kubernetes.io/projected/003dad7c-8300-49a9-80d0-99dcad71fa84-kube-api-access-znmv6\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.092825 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003dad7c-8300-49a9-80d0-99dcad71fa84-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.273323 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: W0226 11:33:52.276759 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff2b3846_c197_4cc6_a442_0f466d97d53d.slice/crio-27a0aa6703826927aa2df442d9029c896cbbf644f8979f33f9b71125f0032906 WatchSource:0}: Error finding container 27a0aa6703826927aa2df442d9029c896cbbf644f8979f33f9b71125f0032906: Status 404 returned error can't find the container with id 27a0aa6703826927aa2df442d9029c896cbbf644f8979f33f9b71125f0032906
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.382310 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.382488 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" containerName="kube-state-metrics" containerID="cri-o://d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b" gracePeriod=30
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.384649 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ff2b3846-c197-4cc6-a442-0f466d97d53d","Type":"ContainerStarted","Data":"27a0aa6703826927aa2df442d9029c896cbbf644f8979f33f9b71125f0032906"}
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.387827 4699 generic.go:334] "Generic (PLEG): container finished" podID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6" exitCode=0
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.387967 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"003dad7c-8300-49a9-80d0-99dcad71fa84","Type":"ContainerDied","Data":"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"}
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.388047 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"003dad7c-8300-49a9-80d0-99dcad71fa84","Type":"ContainerDied","Data":"1ffae0e8d9c34eeb99720f9aac223ffd7663c8e780138c2fbc15f472417c5fda"}
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.388142 4699 scope.go:117] "RemoveContainer" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.388321 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.412437 4699 scope.go:117] "RemoveContainer" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"
Feb 26 11:33:52 crc kubenswrapper[4699]: E0226 11:33:52.412989 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6\": container with ID starting with 24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6 not found: ID does not exist" containerID="24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.413090 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6"} err="failed to get container status \"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6\": rpc error: code = NotFound desc = could not find container \"24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6\": container with ID starting with 24ef43097973fb5b65cb505842a287bfdce1031cc145b5b783382e2c06e63cf6 not found: ID does not exist"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.494284 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.505067 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.513917 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: E0226 11:33:52.515604 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerName="nova-scheduler-scheduler"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.515626 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerName="nova-scheduler-scheduler"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.515799 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" containerName="nova-scheduler-scheduler"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.516424 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.519362 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.535474 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.603043 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.603208 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.603264 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xhh\" (UniqueName: \"kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.706609 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xhh\" (UniqueName: \"kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.706781 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.706884 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.711779 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.714820 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.751005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xhh\" (UniqueName: \"kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh\") pod \"nova-scheduler-0\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.844783 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 11:33:52 crc kubenswrapper[4699]: I0226 11:33:52.922343 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.012036 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4vnz\" (UniqueName: \"kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz\") pod \"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf\" (UID: \"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf\") "
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.016799 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz" (OuterVolumeSpecName: "kube-api-access-r4vnz") pod "2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" (UID: "2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf"). InnerVolumeSpecName "kube-api-access-r4vnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.114080 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4vnz\" (UniqueName: \"kubernetes.io/projected/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf-kube-api-access-r4vnz\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.393418 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.406517 4699 generic.go:334] "Generic (PLEG): container finished" podID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerID="9d7c08d80dbb246a87244e1998a0a8a3673755ff4e13d9dbf8e24611acf9af5c" exitCode=0
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.406908 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerDied","Data":"9d7c08d80dbb246a87244e1998a0a8a3673755ff4e13d9dbf8e24611acf9af5c"}
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.411549 4699 generic.go:334] "Generic (PLEG): container finished" podID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" containerID="d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b" exitCode=2
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.411728 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.411879 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf","Type":"ContainerDied","Data":"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b"}
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.411979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf","Type":"ContainerDied","Data":"d71534977c30792b789d4e1ac180ec5af3f9ed3738ad0ab651747396010424ea"}
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.412070 4699 scope.go:117] "RemoveContainer" containerID="d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b"
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.433416 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ff2b3846-c197-4cc6-a442-0f466d97d53d","Type":"ContainerStarted","Data":"9dd8780b5b90628e97e8e5acf91fb3f6e703343d5528f5eff51fd3ebf041878e"}
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.433701 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.435258 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.465610 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.465580155 podStartE2EDuration="2.465580155s" podCreationTimestamp="2026-02-26 11:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:53.450716434 +0000 UTC m=+1379.261542878" watchObservedRunningTime="2026-02-26 11:33:53.465580155 +0000 UTC m=+1379.276406599" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.477552 4699 scope.go:117] "RemoveContainer" containerID="d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b" Feb 26 11:33:53 crc kubenswrapper[4699]: E0226 11:33:53.480717 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b\": container with ID starting with d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b not found: ID does not exist" containerID="d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.480790 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b"} err="failed to get container status \"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b\": rpc error: code = NotFound desc = could not find container \"d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b\": container with ID starting with d2f101e4ea8ddf069d258e4d0b303b0807a2a990a84939a9396f47359ac82f3b not found: ID does not exist" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.513311 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.524239 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data\") pod \"3da58d42-6c34-4a38-b9dc-eeeb20542955\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.524685 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs\") pod \"3da58d42-6c34-4a38-b9dc-eeeb20542955\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.524860 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrt7n\" (UniqueName: \"kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n\") pod \"3da58d42-6c34-4a38-b9dc-eeeb20542955\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.525004 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle\") pod \"3da58d42-6c34-4a38-b9dc-eeeb20542955\" (UID: \"3da58d42-6c34-4a38-b9dc-eeeb20542955\") " Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.526346 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.527276 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs" (OuterVolumeSpecName: "logs") pod "3da58d42-6c34-4a38-b9dc-eeeb20542955" (UID: "3da58d42-6c34-4a38-b9dc-eeeb20542955"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.531158 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n" (OuterVolumeSpecName: "kube-api-access-qrt7n") pod "3da58d42-6c34-4a38-b9dc-eeeb20542955" (UID: "3da58d42-6c34-4a38-b9dc-eeeb20542955"). InnerVolumeSpecName "kube-api-access-qrt7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.534422 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 11:33:53 crc kubenswrapper[4699]: E0226 11:33:53.534983 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" containerName="kube-state-metrics" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535010 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" containerName="kube-state-metrics" Feb 26 11:33:53 crc kubenswrapper[4699]: E0226 11:33:53.535052 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-log" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535061 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-log" Feb 26 11:33:53 crc kubenswrapper[4699]: E0226 11:33:53.535083 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-api" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535091 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-api" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535329 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-log" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535352 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" containerName="nova-api-api" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.535363 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" containerName="kube-state-metrics" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.536440 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.538620 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.539019 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.545586 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.568730 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data" (OuterVolumeSpecName: "config-data") pod "3da58d42-6c34-4a38-b9dc-eeeb20542955" (UID: "3da58d42-6c34-4a38-b9dc-eeeb20542955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.568992 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3da58d42-6c34-4a38-b9dc-eeeb20542955" (UID: "3da58d42-6c34-4a38-b9dc-eeeb20542955"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627073 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627232 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627282 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs6jm\" (UniqueName: \"kubernetes.io/projected/c685fadd-b283-40bc-9de2-3372317b9875-kube-api-access-fs6jm\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627425 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da58d42-6c34-4a38-b9dc-eeeb20542955-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 
11:33:53.627438 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrt7n\" (UniqueName: \"kubernetes.io/projected/3da58d42-6c34-4a38-b9dc-eeeb20542955-kube-api-access-qrt7n\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627447 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.627455 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da58d42-6c34-4a38-b9dc-eeeb20542955-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.729431 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.729817 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs6jm\" (UniqueName: \"kubernetes.io/projected/c685fadd-b283-40bc-9de2-3372317b9875-kube-api-access-fs6jm\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.729895 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 
11:33:53.729964 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.733291 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.733324 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.734679 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.735901 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.743770 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c685fadd-b283-40bc-9de2-3372317b9875-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.753318 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs6jm\" (UniqueName: 
\"kubernetes.io/projected/c685fadd-b283-40bc-9de2-3372317b9875-kube-api-access-fs6jm\") pod \"kube-state-metrics-0\" (UID: \"c685fadd-b283-40bc-9de2-3372317b9875\") " pod="openstack/kube-state-metrics-0" Feb 26 11:33:53 crc kubenswrapper[4699]: I0226 11:33:53.855620 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.271421 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003dad7c-8300-49a9-80d0-99dcad71fa84" path="/var/lib/kubelet/pods/003dad7c-8300-49a9-80d0-99dcad71fa84/volumes" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.272201 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf" path="/var/lib/kubelet/pods/2ef8c2b0-a7bb-41f7-9b6e-e0e8283428cf/volumes" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.299787 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: W0226 11:33:54.301221 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc685fadd_b283_40bc_9de2_3372317b9875.slice/crio-5063064ca1b59884e4dbb2b88c753d4b5430e522d28b2e9ed2abe45f1c48a096 WatchSource:0}: Error finding container 5063064ca1b59884e4dbb2b88c753d4b5430e522d28b2e9ed2abe45f1c48a096: Status 404 returned error can't find the container with id 5063064ca1b59884e4dbb2b88c753d4b5430e522d28b2e9ed2abe45f1c48a096 Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.398634 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.398976 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-central-agent" 
containerID="cri-o://04dcf7e8e201497d1cf45ed5c29abe7b3a178bdefc6cc1cf11f3cbae4131ffe7" gracePeriod=30 Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.399128 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="sg-core" containerID="cri-o://0a5d89be89958727d068c6f547173e1db9e09eeaa55949f9e4b10646a2418098" gracePeriod=30 Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.399185 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="proxy-httpd" containerID="cri-o://69f610b3a3266f67627b13f99326d06ba576d343b85ee61005b092a805c73f19" gracePeriod=30 Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.399134 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-notification-agent" containerID="cri-o://1e8f4ef353b62f20e3fa0c0b216ab5527d39228a1eccacac6ba930465493a7ed" gracePeriod=30 Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.445385 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da58d42-6c34-4a38-b9dc-eeeb20542955","Type":"ContainerDied","Data":"0001ca71eaea85ec4a7157192b885fb03750c2a30c308dc7404b715439e990b4"} Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.445432 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.445453 4699 scope.go:117] "RemoveContainer" containerID="9d7c08d80dbb246a87244e1998a0a8a3673755ff4e13d9dbf8e24611acf9af5c" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.453572 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c685fadd-b283-40bc-9de2-3372317b9875","Type":"ContainerStarted","Data":"5063064ca1b59884e4dbb2b88c753d4b5430e522d28b2e9ed2abe45f1c48a096"} Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.455267 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8bf50ee-a389-4a35-8899-81d885e1ec38","Type":"ContainerStarted","Data":"c97d92e712559c7220f14c08b215ac1ea015fa517ad26257d3edff7ee08e2ec0"} Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.455334 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8bf50ee-a389-4a35-8899-81d885e1ec38","Type":"ContainerStarted","Data":"35561c1ff93cac360e0003512da7f67e357d0a40bd9387c2cdd037287561205d"} Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.470328 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.474789 4699 scope.go:117] "RemoveContainer" containerID="1c92a80645afa4e46d1addaafa746637c845088f3cba1406f81a67f4dfe1af22" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.482328 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.491194 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.493067 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.493049423 podStartE2EDuration="2.493049423s" 
podCreationTimestamp="2026-02-26 11:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:54.481681093 +0000 UTC m=+1380.292507527" watchObservedRunningTime="2026-02-26 11:33:54.493049423 +0000 UTC m=+1380.303875847" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.493181 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.497026 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.523737 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.548452 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.548541 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvq5n\" (UniqueName: \"kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.548692 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.548744 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.650921 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.651243 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvq5n\" (UniqueName: \"kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.651422 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.651545 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.653049 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " 
pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.662206 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.662542 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.669560 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvq5n\" (UniqueName: \"kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n\") pod \"nova-api-0\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") " pod="openstack/nova-api-0" Feb 26 11:33:54 crc kubenswrapper[4699]: I0226 11:33:54.823721 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.395891 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:33:55 crc kubenswrapper[4699]: W0226 11:33:55.396808 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5783da86_2f9d_42da_ae1e_7df1f4190892.slice/crio-643f5d9494a203696bd97ca6ac87808480d0591a9fdea1382c0442db74ffeabf WatchSource:0}: Error finding container 643f5d9494a203696bd97ca6ac87808480d0591a9fdea1382c0442db74ffeabf: Status 404 returned error can't find the container with id 643f5d9494a203696bd97ca6ac87808480d0591a9fdea1382c0442db74ffeabf Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.470435 4699 generic.go:334] "Generic (PLEG): container finished" podID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerID="69f610b3a3266f67627b13f99326d06ba576d343b85ee61005b092a805c73f19" exitCode=0 Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.471940 4699 generic.go:334] "Generic (PLEG): container finished" podID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerID="0a5d89be89958727d068c6f547173e1db9e09eeaa55949f9e4b10646a2418098" exitCode=2 Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.472044 4699 generic.go:334] "Generic (PLEG): container finished" podID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerID="04dcf7e8e201497d1cf45ed5c29abe7b3a178bdefc6cc1cf11f3cbae4131ffe7" exitCode=0 Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.470606 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerDied","Data":"69f610b3a3266f67627b13f99326d06ba576d343b85ee61005b092a805c73f19"} Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.472320 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerDied","Data":"0a5d89be89958727d068c6f547173e1db9e09eeaa55949f9e4b10646a2418098"} Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.472447 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerDied","Data":"04dcf7e8e201497d1cf45ed5c29abe7b3a178bdefc6cc1cf11f3cbae4131ffe7"} Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.474949 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c685fadd-b283-40bc-9de2-3372317b9875","Type":"ContainerStarted","Data":"e9e80eb50f4f804f9b27d0cd5128479b0efd273dc49d2eace097d699b1117db5"} Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.475173 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.478593 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerStarted","Data":"643f5d9494a203696bd97ca6ac87808480d0591a9fdea1382c0442db74ffeabf"} Feb 26 11:33:55 crc kubenswrapper[4699]: I0226 11:33:55.492761 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.081343088 podStartE2EDuration="2.492722135s" podCreationTimestamp="2026-02-26 11:33:53 +0000 UTC" firstStartedPulling="2026-02-26 11:33:54.303827378 +0000 UTC m=+1380.114653812" lastFinishedPulling="2026-02-26 11:33:54.715206425 +0000 UTC m=+1380.526032859" observedRunningTime="2026-02-26 11:33:55.492332764 +0000 UTC m=+1381.303159218" watchObservedRunningTime="2026-02-26 11:33:55.492722135 +0000 UTC m=+1381.303548589" Feb 26 11:33:56 crc kubenswrapper[4699]: I0226 11:33:56.272064 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3da58d42-6c34-4a38-b9dc-eeeb20542955" path="/var/lib/kubelet/pods/3da58d42-6c34-4a38-b9dc-eeeb20542955/volumes"
Feb 26 11:33:56 crc kubenswrapper[4699]: I0226 11:33:56.502967 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerStarted","Data":"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"}
Feb 26 11:33:56 crc kubenswrapper[4699]: I0226 11:33:56.503008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerStarted","Data":"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"}
Feb 26 11:33:56 crc kubenswrapper[4699]: I0226 11:33:56.521580 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.521555224 podStartE2EDuration="2.521555224s" podCreationTimestamp="2026-02-26 11:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:33:56.521168143 +0000 UTC m=+1382.331994577" watchObservedRunningTime="2026-02-26 11:33:56.521555224 +0000 UTC m=+1382.332381668"
Feb 26 11:33:57 crc kubenswrapper[4699]: I0226 11:33:57.845108 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.522623 4699 generic.go:334] "Generic (PLEG): container finished" podID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerID="1e8f4ef353b62f20e3fa0c0b216ab5527d39228a1eccacac6ba930465493a7ed" exitCode=0
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.522847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerDied","Data":"1e8f4ef353b62f20e3fa0c0b216ab5527d39228a1eccacac6ba930465493a7ed"}
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.687376 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.736179 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.736235 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.739667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.739744 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.739845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.739976 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.740075 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.740175 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdvm\" (UniqueName: \"kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.740367 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd\") pod \"4b59e03f-0c75-40b0-9eb3-d5113163f420\" (UID: \"4b59e03f-0c75-40b0-9eb3-d5113163f420\") "
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.743091 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.743841 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.757820 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts" (OuterVolumeSpecName: "scripts") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.770583 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm" (OuterVolumeSpecName: "kube-api-access-jwdvm") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "kube-api-access-jwdvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.794944 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.845668 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.845709 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.845722 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.845734 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdvm\" (UniqueName: \"kubernetes.io/projected/4b59e03f-0c75-40b0-9eb3-d5113163f420-kube-api-access-jwdvm\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.845747 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b59e03f-0c75-40b0-9eb3-d5113163f420-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.851964 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.883894 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data" (OuterVolumeSpecName: "config-data") pod "4b59e03f-0c75-40b0-9eb3-d5113163f420" (UID: "4b59e03f-0c75-40b0-9eb3-d5113163f420"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.947483 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:58 crc kubenswrapper[4699]: I0226 11:33:58.947531 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b59e03f-0c75-40b0-9eb3-d5113163f420-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.534496 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b59e03f-0c75-40b0-9eb3-d5113163f420","Type":"ContainerDied","Data":"0e70588ca2c32d2c8bf18e61605cae154752eb6909030e1d1477c1cf1b1f9f0c"}
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.534585 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.534828 4699 scope.go:117] "RemoveContainer" containerID="69f610b3a3266f67627b13f99326d06ba576d343b85ee61005b092a805c73f19"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.554882 4699 scope.go:117] "RemoveContainer" containerID="0a5d89be89958727d068c6f547173e1db9e09eeaa55949f9e4b10646a2418098"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.583325 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.584210 4699 scope.go:117] "RemoveContainer" containerID="1e8f4ef353b62f20e3fa0c0b216ab5527d39228a1eccacac6ba930465493a7ed"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.591282 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.612932 4699 scope.go:117] "RemoveContainer" containerID="04dcf7e8e201497d1cf45ed5c29abe7b3a178bdefc6cc1cf11f3cbae4131ffe7"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643174 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:59 crc kubenswrapper[4699]: E0226 11:33:59.643601 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="sg-core"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643618 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="sg-core"
Feb 26 11:33:59 crc kubenswrapper[4699]: E0226 11:33:59.643642 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="proxy-httpd"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643649 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="proxy-httpd"
Feb 26 11:33:59 crc kubenswrapper[4699]: E0226 11:33:59.643665 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-central-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643673 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-central-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: E0226 11:33:59.643684 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-notification-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643691 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-notification-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643917 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="sg-core"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643942 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="proxy-httpd"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643960 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-central-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.643977 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" containerName="ceilometer-notification-agent"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.647589 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.651316 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.654530 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.654769 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.655477 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.762900 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.762972 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763040 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763080 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\"
(UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763152 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5n4\" (UniqueName: \"kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763237 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.763275 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.770863 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.770843 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866550 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866633 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866665 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866751 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866793 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866823 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.866859 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5n4\" (UniqueName: \"kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.868703 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.868811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") "
pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.872918 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.872938 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.874099 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.874732 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.887513 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.891961 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5n4\" (UniqueName: \"kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4\") pod \"ceilometer-0\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " pod="openstack/ceilometer-0"
Feb 26 11:33:59 crc kubenswrapper[4699]: I0226 11:33:59.966595 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.137606 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535094-ccf5t"]
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.140018 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.143279 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.143464 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.143581 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.150986 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535094-ccf5t"]
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.182890 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbd9\" (UniqueName: \"kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9\") pod \"auto-csr-approver-29535094-ccf5t\" (UID: \"a63cbb99-64c1-46fe-99eb-0d06cc310cba\") " pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.275186 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b59e03f-0c75-40b0-9eb3-d5113163f420" path="/var/lib/kubelet/pods/4b59e03f-0c75-40b0-9eb3-d5113163f420/volumes"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.285961 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbd9\" (UniqueName: \"kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9\") pod \"auto-csr-approver-29535094-ccf5t\" (UID: \"a63cbb99-64c1-46fe-99eb-0d06cc310cba\") " pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.305844 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbd9\" (UniqueName: \"kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9\") pod \"auto-csr-approver-29535094-ccf5t\" (UID: \"a63cbb99-64c1-46fe-99eb-0d06cc310cba\") " pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.471752 4699 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.536560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:00 crc kubenswrapper[4699]: I0226 11:34:00.947370 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535094-ccf5t"]
Feb 26 11:34:01 crc kubenswrapper[4699]: I0226 11:34:01.554964 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerStarted","Data":"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3"}
Feb 26 11:34:01 crc kubenswrapper[4699]: I0226 11:34:01.555399 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerStarted","Data":"68d0a1dbee3e4e680859b8d4e019b458f07850a3366f9a21d8ad3957b8f3d34a"}
Feb 26 11:34:01 crc kubenswrapper[4699]: I0226 11:34:01.556668 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535094-ccf5t" event={"ID":"a63cbb99-64c1-46fe-99eb-0d06cc310cba","Type":"ContainerStarted","Data":"e03c8581bed6014bcc595dc5801f6720cf5965259cb812efd65758bd0cf0dcb7"}
Feb 26 11:34:01 crc kubenswrapper[4699]: I0226 11:34:01.842259 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 26 11:34:02 crc kubenswrapper[4699]: I0226 11:34:02.567338 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerStarted","Data":"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5"}
Feb 26 11:34:02 crc kubenswrapper[4699]: I0226 11:34:02.568926 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535094-ccf5t" event={"ID":"a63cbb99-64c1-46fe-99eb-0d06cc310cba","Type":"ContainerStarted","Data":"2fbcb8eac2ddc22c3ecc04313ce75c8a329d85e31714a8bfe7dae5bd6310f0ad"}
Feb 26 11:34:02 crc kubenswrapper[4699]: I0226 11:34:02.586950 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535094-ccf5t" podStartSLOduration=1.4889258810000001 podStartE2EDuration="2.586919258s" podCreationTimestamp="2026-02-26 11:34:00 +0000 UTC" firstStartedPulling="2026-02-26 11:34:00.943416438 +0000 UTC m=+1386.754242862" lastFinishedPulling="2026-02-26 11:34:02.041409805 +0000 UTC m=+1387.852236239" observedRunningTime="2026-02-26 11:34:02.583037145 +0000 UTC m=+1388.393863609" watchObservedRunningTime="2026-02-26 11:34:02.586919258 +0000 UTC m=+1388.397745702"
Feb 26 11:34:02 crc kubenswrapper[4699]: I0226 11:34:02.845163 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 26 11:34:02 crc kubenswrapper[4699]: I0226 11:34:02.875866 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 26 11:34:03 crc kubenswrapper[4699]: I0226 11:34:03.579872 4699 generic.go:334] "Generic (PLEG): container finished" podID="a63cbb99-64c1-46fe-99eb-0d06cc310cba" containerID="2fbcb8eac2ddc22c3ecc04313ce75c8a329d85e31714a8bfe7dae5bd6310f0ad" exitCode=0
Feb 26 11:34:03 crc kubenswrapper[4699]: I0226 11:34:03.579952 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535094-ccf5t" event={"ID":"a63cbb99-64c1-46fe-99eb-0d06cc310cba","Type":"ContainerDied","Data":"2fbcb8eac2ddc22c3ecc04313ce75c8a329d85e31714a8bfe7dae5bd6310f0ad"}
Feb 26 11:34:03 crc kubenswrapper[4699]: I0226 11:34:03.581975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerStarted","Data":"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b"}
Feb 26 11:34:03 crc kubenswrapper[4699]: I0226 11:34:03.623789 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 26 11:34:03 crc kubenswrapper[4699]: I0226 11:34:03.864426 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 26 11:34:04 crc kubenswrapper[4699]: I0226 11:34:04.825072 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 11:34:04 crc kubenswrapper[4699]: I0226 11:34:04.825429 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 11:34:04 crc kubenswrapper[4699]: I0226 11:34:04.996203 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.085025 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbd9\" (UniqueName: \"kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9\") pod \"a63cbb99-64c1-46fe-99eb-0d06cc310cba\" (UID: \"a63cbb99-64c1-46fe-99eb-0d06cc310cba\") "
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.091319 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9" (OuterVolumeSpecName: "kube-api-access-xfbd9") pod "a63cbb99-64c1-46fe-99eb-0d06cc310cba" (UID: "a63cbb99-64c1-46fe-99eb-0d06cc310cba"). InnerVolumeSpecName "kube-api-access-xfbd9".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.187785 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbd9\" (UniqueName: \"kubernetes.io/projected/a63cbb99-64c1-46fe-99eb-0d06cc310cba-kube-api-access-xfbd9\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.625744 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535094-ccf5t"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.626251 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535094-ccf5t" event={"ID":"a63cbb99-64c1-46fe-99eb-0d06cc310cba","Type":"ContainerDied","Data":"e03c8581bed6014bcc595dc5801f6720cf5965259cb812efd65758bd0cf0dcb7"}
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.626327 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e03c8581bed6014bcc595dc5801f6720cf5965259cb812efd65758bd0cf0dcb7"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.646290 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerStarted","Data":"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569"}
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.646808 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.708740 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535088-rwpx5"]
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.721729 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535088-rwpx5"]
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.731641 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.783543847 podStartE2EDuration="6.731612972s" podCreationTimestamp="2026-02-26 11:33:59 +0000 UTC" firstStartedPulling="2026-02-26 11:34:00.557069459 +0000 UTC m=+1386.367895893" lastFinishedPulling="2026-02-26 11:34:04.505138584 +0000 UTC m=+1390.315965018" observedRunningTime="2026-02-26 11:34:05.675768141 +0000 UTC m=+1391.486594585" watchObservedRunningTime="2026-02-26 11:34:05.731612972 +0000 UTC m=+1391.542439416"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.908326 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:34:05 crc kubenswrapper[4699]: I0226 11:34:05.908339 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 11:34:06 crc kubenswrapper[4699]: I0226 11:34:06.272744 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05b2b3d-2906-4acc-aaa2-2f2674e46f27" path="/var/lib/kubelet/pods/d05b2b3d-2906-4acc-aaa2-2f2674e46f27/volumes"
Feb 26 11:34:08 crc kubenswrapper[4699]: I0226 11:34:08.738840 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 11:34:08 crc kubenswrapper[4699]: I0226 11:34:08.741493 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 11:34:08 crc kubenswrapper[4699]: I0226 11:34:08.745647 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 11:34:09 crc kubenswrapper[4699]: I0226 11:34:09.699834 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.535461 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.598102 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7zp\" (UniqueName: \"kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp\") pod \"6d5f37fe-0099-471b-9192-5f52735977b1\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") "
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.598363 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data\") pod \"6d5f37fe-0099-471b-9192-5f52735977b1\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") "
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.598420 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle\") pod \"6d5f37fe-0099-471b-9192-5f52735977b1\" (UID: \"6d5f37fe-0099-471b-9192-5f52735977b1\") "
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.620000 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp" (OuterVolumeSpecName: "kube-api-access-mp7zp") pod "6d5f37fe-0099-471b-9192-5f52735977b1" (UID: "6d5f37fe-0099-471b-9192-5f52735977b1"). InnerVolumeSpecName "kube-api-access-mp7zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.632002 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d5f37fe-0099-471b-9192-5f52735977b1" (UID: "6d5f37fe-0099-471b-9192-5f52735977b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.656379 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data" (OuterVolumeSpecName: "config-data") pod "6d5f37fe-0099-471b-9192-5f52735977b1" (UID: "6d5f37fe-0099-471b-9192-5f52735977b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.700498 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.700526 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7zp\" (UniqueName: \"kubernetes.io/projected/6d5f37fe-0099-471b-9192-5f52735977b1-kube-api-access-mp7zp\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.700553 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5f37fe-0099-471b-9192-5f52735977b1-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.707246 4699 generic.go:334] "Generic (PLEG): container finished" podID="6d5f37fe-0099-471b-9192-5f52735977b1" containerID="9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7" exitCode=137
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.707388 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d5f37fe-0099-471b-9192-5f52735977b1","Type":"ContainerDied","Data":"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"}
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.707429 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.707484 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d5f37fe-0099-471b-9192-5f52735977b1","Type":"ContainerDied","Data":"891a5f2f8e4df93b7d5e317f0bde0ca23ec3dcb73c4f3ad638024da213a38a6c"}
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.707517 4699 scope.go:117] "RemoveContainer" containerID="9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.741596 4699 scope.go:117] "RemoveContainer" containerID="9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"
Feb 26 11:34:11 crc kubenswrapper[4699]: E0226 11:34:10.743105 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7\": container with ID starting with 9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7 not found: ID does not exist" containerID="9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.743183 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7"} err="failed to get container status \"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7\": rpc error: code = NotFound desc = could not find container \"9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7\": container with ID starting with 9a9565174025b2a66d72042b489f69a27808267d25f5c7efcc2fce77583169c7 not found: ID does not exist"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.754604 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.773617 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.787601 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 11:34:11 crc kubenswrapper[4699]: E0226 11:34:10.788080 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5f37fe-0099-471b-9192-5f52735977b1" containerName="nova-cell1-novncproxy-novncproxy"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.788096 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5f37fe-0099-471b-9192-5f52735977b1" containerName="nova-cell1-novncproxy-novncproxy"
Feb 26 11:34:11 crc kubenswrapper[4699]: E0226 11:34:10.788105 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63cbb99-64c1-46fe-99eb-0d06cc310cba" containerName="oc"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.788126 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63cbb99-64c1-46fe-99eb-0d06cc310cba" containerName="oc"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.788328 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63cbb99-64c1-46fe-99eb-0d06cc310cba" containerName="oc"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.788348 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5f37fe-0099-471b-9192-5f52735977b1" containerName="nova-cell1-novncproxy-novncproxy"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.789064 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.791270 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.798458 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.798737 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.799418 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.904257 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.904617 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.904698 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.904898 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vgd\" (UniqueName: \"kubernetes.io/projected/8bb28763-ceae-456c-a0d6-5df33b478106-kube-api-access-26vgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:10.904959 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.007018 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.007084 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.007166 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vgd\" (UniqueName: \"kubernetes.io/projected/8bb28763-ceae-456c-a0d6-5df33b478106-kube-api-access-26vgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.007190 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.007236 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.011246 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.011262 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.011808 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.013270 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb28763-ceae-456c-a0d6-5df33b478106-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.023902 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vgd\" (UniqueName: \"kubernetes.io/projected/8bb28763-ceae-456c-a0d6-5df33b478106-kube-api-access-26vgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8bb28763-ceae-456c-a0d6-5df33b478106\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.123985 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.585102 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.585399 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.585441 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.586157 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.586213 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6" gracePeriod=600
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.718590 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6" exitCode=0
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.718637 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6"}
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.718681 4699 scope.go:117] "RemoveContainer" containerID="2c2d25c558a927e58d9962b6f55de97dac3f222cb5bc89a35791fca832759b03"
Feb 26 11:34:11 crc kubenswrapper[4699]: I0226 11:34:11.908551 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 11:34:12 crc kubenswrapper[4699]: I0226 11:34:12.273632 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5f37fe-0099-471b-9192-5f52735977b1" path="/var/lib/kubelet/pods/6d5f37fe-0099-471b-9192-5f52735977b1/volumes"
Feb 26 11:34:12 crc kubenswrapper[4699]: I0226 11:34:12.730386 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"}
Feb 26 11:34:12 crc kubenswrapper[4699]: I0226 11:34:12.739357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8bb28763-ceae-456c-a0d6-5df33b478106","Type":"ContainerStarted","Data":"4f166a9252cb921a215a88fe04068a2f30d2e2ae3cf00bbac0de70d3ed780392"}
Feb 26 11:34:12 crc kubenswrapper[4699]: I0226 11:34:12.739418 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8bb28763-ceae-456c-a0d6-5df33b478106","Type":"ContainerStarted","Data":"0fe313e864cfef4cdb1bde0ae46f205943a65667d28cb0c5d56d3359e063c281"}
Feb 26 11:34:12 crc kubenswrapper[4699]: I0226 11:34:12.785223 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.785193265 podStartE2EDuration="2.785193265s" podCreationTimestamp="2026-02-26 11:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:12.769842959 +0000 UTC m=+1398.580669393" watchObservedRunningTime="2026-02-26 11:34:12.785193265 +0000 UTC m=+1398.596019709"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.827847 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.829260 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.829763 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.829789 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.831921 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 11:34:14 crc kubenswrapper[4699]: I0226 11:34:14.832791 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.029160 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"]
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.033226 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.041024 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"]
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.191496 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.191554 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9gh\" (UniqueName: \"kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.191933 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.192279 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.192358 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.193389 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296305 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296404 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296483 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296530 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9gh\" (UniqueName: \"kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296678 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.296835 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.297425 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.298040 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.298176 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.298208 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.298386 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.328920 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9gh\" (UniqueName: \"kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh\") pod \"dnsmasq-dns-5c7b6c5df9-n24ct\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.352072 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:15 crc kubenswrapper[4699]: I0226 11:34:15.882840 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"]
Feb 26 11:34:16 crc kubenswrapper[4699]: I0226 11:34:16.124778 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:16 crc kubenswrapper[4699]: I0226 11:34:16.779350 4699 generic.go:334] "Generic (PLEG): container finished" podID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerID="ac279ccad47adb2f6ab2c9bfda803625849869922644f32045786543361b143f" exitCode=0
Feb 26 11:34:16 crc kubenswrapper[4699]: I0226 11:34:16.779416 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" event={"ID":"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8","Type":"ContainerDied","Data":"ac279ccad47adb2f6ab2c9bfda803625849869922644f32045786543361b143f"}
Feb 26 11:34:16 crc kubenswrapper[4699]: I0226 11:34:16.779761 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" event={"ID":"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8","Type":"ContainerStarted","Data":"d1d082240eaff72440b2e6ab6682cc7abdf39c898255b3c76048247bf61866be"}
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.035917 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.036739 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="sg-core" containerID="cri-o://742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b" gracePeriod=30
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.036776 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="proxy-httpd" containerID="cri-o://7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569" gracePeriod=30
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.036739 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-notification-agent" containerID="cri-o://f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5" gracePeriod=30
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.037330 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-central-agent" containerID="cri-o://370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3" gracePeriod=30
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.048003 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.209:3000/\": EOF"
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.553402 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.794420 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" event={"ID":"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8","Type":"ContainerStarted","Data":"6eaae1ee8cf33fbf9c5f7338398f314d84ab95982df8c9ecfdd230c190623ca8"}
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.794568 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct"
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.802759 4699 generic.go:334] "Generic (PLEG): container finished" podID="48cbc02a-15d3-4ae1-852f-24658804939b" containerID="7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569" exitCode=0
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.802927 4699 generic.go:334] "Generic (PLEG): container finished" podID="48cbc02a-15d3-4ae1-852f-24658804939b" containerID="742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b" exitCode=2
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.802983 4699 generic.go:334] "Generic (PLEG): container finished" podID="48cbc02a-15d3-4ae1-852f-24658804939b" containerID="370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3" exitCode=0
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.802866 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerDied","Data":"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569"}
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.803076 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerDied","Data":"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b"}
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.803092 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerDied","Data":"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3"}
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.803336 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-log" containerID="cri-o://5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd" gracePeriod=30
Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.803382 4699 kuberuntime_container.go:808] "Killing container with
a grace period" pod="openstack/nova-api-0" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-api" containerID="cri-o://7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4" gracePeriod=30 Feb 26 11:34:17 crc kubenswrapper[4699]: I0226 11:34:17.829509 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" podStartSLOduration=2.829486137 podStartE2EDuration="2.829486137s" podCreationTimestamp="2026-02-26 11:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:17.817420286 +0000 UTC m=+1403.628246720" watchObservedRunningTime="2026-02-26 11:34:17.829486137 +0000 UTC m=+1403.640312571" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.736670 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.770792 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.770868 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.770931 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl5n4\" (UniqueName: \"kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: 
\"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.770963 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.771007 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.771064 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.771091 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.771133 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd\") pod \"48cbc02a-15d3-4ae1-852f-24658804939b\" (UID: \"48cbc02a-15d3-4ae1-852f-24658804939b\") " Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.772470 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.786676 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.800339 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts" (OuterVolumeSpecName: "scripts") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.822486 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4" (OuterVolumeSpecName: "kube-api-access-jl5n4") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "kube-api-access-jl5n4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.873620 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl5n4\" (UniqueName: \"kubernetes.io/projected/48cbc02a-15d3-4ae1-852f-24658804939b-kube-api-access-jl5n4\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.873647 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.873656 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.873664 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48cbc02a-15d3-4ae1-852f-24658804939b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.887268 4699 generic.go:334] "Generic (PLEG): container finished" podID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerID="5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd" exitCode=143 Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.887454 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerDied","Data":"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"} Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.919331 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: 
"48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.937573 4699 generic.go:334] "Generic (PLEG): container finished" podID="48cbc02a-15d3-4ae1-852f-24658804939b" containerID="f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5" exitCode=0 Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.938731 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.939297 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerDied","Data":"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5"} Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.939323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48cbc02a-15d3-4ae1-852f-24658804939b","Type":"ContainerDied","Data":"68d0a1dbee3e4e680859b8d4e019b458f07850a3366f9a21d8ad3957b8f3d34a"} Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.939340 4699 scope.go:117] "RemoveContainer" containerID="7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.975734 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:18 crc kubenswrapper[4699]: I0226 11:34:18.990997 4699 scope.go:117] "RemoveContainer" containerID="742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.012265 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.020245 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.024467 4699 scope.go:117] "RemoveContainer" containerID="f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.049951 4699 scope.go:117] "RemoveContainer" containerID="370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.076379 4699 scope.go:117] "RemoveContainer" containerID="7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.077730 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.077748 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.078702 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569\": container with ID starting with 7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569 not found: ID does not exist" containerID="7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.078771 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569"} err="failed to get container status \"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569\": rpc error: code = NotFound desc = could not find container \"7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569\": container with ID starting with 7497611bc316a363e6faede71364bc2badf1e82a9e7163a9b0d3885e64e69569 not found: ID does not exist" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.078835 4699 scope.go:117] "RemoveContainer" containerID="742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.079331 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b\": container with ID starting with 742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b not found: ID does not exist" containerID="742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.079366 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b"} err="failed to get container status \"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b\": rpc error: code = NotFound desc = could not find container 
\"742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b\": container with ID starting with 742583c4f8071616ca7933fdc484002228afd074d9f17ea694ae2c324eb67a8b not found: ID does not exist" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.079394 4699 scope.go:117] "RemoveContainer" containerID="f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.081950 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5\": container with ID starting with f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5 not found: ID does not exist" containerID="f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.081978 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5"} err="failed to get container status \"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5\": rpc error: code = NotFound desc = could not find container \"f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5\": container with ID starting with f9988ddb231f2367ed1aeb8133431d53f6c03ac1bb83b203e17a77ec850b07c5 not found: ID does not exist" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.081995 4699 scope.go:117] "RemoveContainer" containerID="370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.082244 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3\": container with ID starting with 370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3 not found: ID does not exist" 
containerID="370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.082274 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3"} err="failed to get container status \"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3\": rpc error: code = NotFound desc = could not find container \"370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3\": container with ID starting with 370c564c94459e8d8c4a181df1cfad272d11acf7f598532418862df6abf6a7c3 not found: ID does not exist" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.097662 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data" (OuterVolumeSpecName: "config-data") pod "48cbc02a-15d3-4ae1-852f-24658804939b" (UID: "48cbc02a-15d3-4ae1-852f-24658804939b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.179448 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cbc02a-15d3-4ae1-852f-24658804939b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.368989 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.388306 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.414270 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.414848 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="sg-core" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.415504 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="sg-core" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.415760 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-central-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.415822 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-central-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.416063 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="proxy-httpd" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.416202 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="proxy-httpd" Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 
11:34:19.416270 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-notification-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.416317 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-notification-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.416682 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="proxy-httpd" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.416854 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-central-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.416913 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="ceilometer-notification-agent" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.417031 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" containerName="sg-core" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.420030 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.423079 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.423454 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.423430 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.425907 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.433770 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 11:34:19 crc kubenswrapper[4699]: E0226 11:34:19.454461 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-7trcj log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-7trcj log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="cfc43627-a5fc-40fe-b7a4-6d04e80481dd" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.601510 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.601565 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.601655 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trcj\" (UniqueName: \"kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.601928 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.602032 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.602098 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.602159 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data\") pod \"ceilometer-0\" (UID: 
\"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.602269 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.703791 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.703885 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.703917 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.703961 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trcj\" (UniqueName: \"kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704011 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704035 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704090 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704387 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704534 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.704767 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0" Feb 26 11:34:19 crc 
kubenswrapper[4699]: I0226 11:34:19.716282 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.716604 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.717939 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.718612 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.723347 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trcj\" (UniqueName: \"kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.730454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") " pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.946836 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:19 crc kubenswrapper[4699]: I0226 11:34:19.959186 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.111606 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.111991 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112078 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112141 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112180 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112231 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trcj\" (UniqueName: \"kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112329 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd\") pod \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\" (UID: \"cfc43627-a5fc-40fe-b7a4-6d04e80481dd\") "
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112423 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.112745 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.113177 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.113200 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.117013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts" (OuterVolumeSpecName: "scripts") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.117724 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.117752 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.117738 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj" (OuterVolumeSpecName: "kube-api-access-7trcj") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "kube-api-access-7trcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.118006 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data" (OuterVolumeSpecName: "config-data") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.118417 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfc43627-a5fc-40fe-b7a4-6d04e80481dd" (UID: "cfc43627-a5fc-40fe-b7a4-6d04e80481dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215036 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215103 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215155 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215169 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215181 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trcj\" (UniqueName: \"kubernetes.io/projected/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-kube-api-access-7trcj\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.215196 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc43627-a5fc-40fe-b7a4-6d04e80481dd-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.271297 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cbc02a-15d3-4ae1-852f-24658804939b" path="/var/lib/kubelet/pods/48cbc02a-15d3-4ae1-852f-24658804939b/volumes"
Feb 26 11:34:20 crc kubenswrapper[4699]: I0226 11:34:20.956380 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.126388 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.127207 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.145488 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.157526 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.157573 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.160043 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.162042 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.162240 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.162607 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.173708 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.342475 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-scripts\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.342532 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.342698 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-run-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.342939 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.343014 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-config-data\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.343071 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x27wk\" (UniqueName: \"kubernetes.io/projected/09a6eb79-27c3-465b-adae-b32d96c56b65-kube-api-access-x27wk\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.343096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-log-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.343205 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.441058 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.444804 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.444871 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-config-data\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.444912 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x27wk\" (UniqueName: \"kubernetes.io/projected/09a6eb79-27c3-465b-adae-b32d96c56b65-kube-api-access-x27wk\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445613 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-log-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445742 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-scripts\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445777 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445850 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-run-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.445967 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-log-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.446434 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09a6eb79-27c3-465b-adae-b32d96c56b65-run-httpd\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.450091 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-scripts\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.450497 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.454050 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-config-data\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.458011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.458050 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a6eb79-27c3-465b-adae-b32d96c56b65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.469745 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x27wk\" (UniqueName: \"kubernetes.io/projected/09a6eb79-27c3-465b-adae-b32d96c56b65-kube-api-access-x27wk\") pod \"ceilometer-0\" (UID: \"09a6eb79-27c3-465b-adae-b32d96c56b65\") " pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.480703 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.547541 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvq5n\" (UniqueName: \"kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n\") pod \"5783da86-2f9d-42da-ae1e-7df1f4190892\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") "
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.547638 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs\") pod \"5783da86-2f9d-42da-ae1e-7df1f4190892\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") "
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.547688 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data\") pod \"5783da86-2f9d-42da-ae1e-7df1f4190892\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") "
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.547717 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle\") pod \"5783da86-2f9d-42da-ae1e-7df1f4190892\" (UID: \"5783da86-2f9d-42da-ae1e-7df1f4190892\") "
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.548342 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs" (OuterVolumeSpecName: "logs") pod "5783da86-2f9d-42da-ae1e-7df1f4190892" (UID: "5783da86-2f9d-42da-ae1e-7df1f4190892"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.558247 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n" (OuterVolumeSpecName: "kube-api-access-wvq5n") pod "5783da86-2f9d-42da-ae1e-7df1f4190892" (UID: "5783da86-2f9d-42da-ae1e-7df1f4190892"). InnerVolumeSpecName "kube-api-access-wvq5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.582584 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data" (OuterVolumeSpecName: "config-data") pod "5783da86-2f9d-42da-ae1e-7df1f4190892" (UID: "5783da86-2f9d-42da-ae1e-7df1f4190892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.604007 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5783da86-2f9d-42da-ae1e-7df1f4190892" (UID: "5783da86-2f9d-42da-ae1e-7df1f4190892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.650263 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvq5n\" (UniqueName: \"kubernetes.io/projected/5783da86-2f9d-42da-ae1e-7df1f4190892-kube-api-access-wvq5n\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.650299 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5783da86-2f9d-42da-ae1e-7df1f4190892-logs\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.650313 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.650325 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5783da86-2f9d-42da-ae1e-7df1f4190892-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.968290 4699 generic.go:334] "Generic (PLEG): container finished" podID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerID="7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4" exitCode=0
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.968393 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.968420 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerDied","Data":"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"}
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.969385 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5783da86-2f9d-42da-ae1e-7df1f4190892","Type":"ContainerDied","Data":"643f5d9494a203696bd97ca6ac87808480d0591a9fdea1382c0442db74ffeabf"}
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.969434 4699 scope.go:117] "RemoveContainer" containerID="7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"
Feb 26 11:34:21 crc kubenswrapper[4699]: I0226 11:34:21.990104 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.004811 4699 scope.go:117] "RemoveContainer" containerID="5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.041641 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.124140 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.153407 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 11:34:22 crc kubenswrapper[4699]: E0226 11:34:22.156541 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-log"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.156578 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-log"
Feb 26 11:34:22 crc kubenswrapper[4699]: E0226 11:34:22.156625 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-api"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.156635 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-api"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.156858 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-log"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.156881 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" containerName="nova-api-api"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.158567 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.159613 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.161934 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.162182 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.162340 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.163946 4699 scope.go:117] "RemoveContainer" containerID="7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"
Feb 26 11:34:22 crc kubenswrapper[4699]: E0226 11:34:22.164661 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4\": container with ID starting with 7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4 not found: ID does not exist" containerID="7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.164703 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4"} err="failed to get container status \"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4\": rpc error: code = NotFound desc = could not find container \"7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4\": container with ID starting with 7419fb8defa328e64d4ff51073bc24c52256ef3871192c86a3186a5335afa3a4 not found: ID does not exist"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.164749 4699 scope.go:117] "RemoveContainer" containerID="5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"
Feb 26 11:34:22 crc kubenswrapper[4699]: E0226 11:34:22.165534 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd\": container with ID starting with 5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd not found: ID does not exist" containerID="5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.165580 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd"} err="failed to get container status \"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd\": rpc error: code = NotFound desc = could not find container \"5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd\": container with ID starting with 5da9136f3d1044f7ed5915a0bc35b164a1ef4c4d70bffc4d5337bc060879e9cd not found: ID does not exist"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.166560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.271510 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5783da86-2f9d-42da-ae1e-7df1f4190892" path="/var/lib/kubelet/pods/5783da86-2f9d-42da-ae1e-7df1f4190892/volumes"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.272262 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc43627-a5fc-40fe-b7a4-6d04e80481dd" path="/var/lib/kubelet/pods/cfc43627-a5fc-40fe-b7a4-6d04e80481dd/volumes"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.272672 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-77cbz"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.273849 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-77cbz"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.276187 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-77cbz"]
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.276371 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.276680 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280388 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280431 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280492 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280546 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280566 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwcz\" (UniqueName: \"kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.280624 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382151 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382237 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382315 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382352 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382381 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382443 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382462 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwcz\" (UniqueName: \"kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.382529 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hfk\" (UniqueName: \"kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.383781 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.388279 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.389411 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0"
Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.398707 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") "
pod="openstack/nova-api-0" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.403462 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.404391 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwcz\" (UniqueName: \"kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz\") pod \"nova-api-0\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " pod="openstack/nova-api-0" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.483882 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.484187 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.484455 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.485171 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hfk\" (UniqueName: \"kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.485860 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.488895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.489529 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.493326 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc 
kubenswrapper[4699]: I0226 11:34:22.509520 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hfk\" (UniqueName: \"kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk\") pod \"nova-cell1-cell-mapping-77cbz\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.595930 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.940518 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:22 crc kubenswrapper[4699]: W0226 11:34:22.947468 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod325cc77b_b7fa_435b_b6fe_332ee76d0feb.slice/crio-f0fb906b3d6a1aee565ebdc2050e6c35e9bcecf22462ca9f4656f519d7f3b17f WatchSource:0}: Error finding container f0fb906b3d6a1aee565ebdc2050e6c35e9bcecf22462ca9f4656f519d7f3b17f: Status 404 returned error can't find the container with id f0fb906b3d6a1aee565ebdc2050e6c35e9bcecf22462ca9f4656f519d7f3b17f Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.991947 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09a6eb79-27c3-465b-adae-b32d96c56b65","Type":"ContainerStarted","Data":"b953548672b22e05f3c8a2d7c2f458bb37c1ef0d94f774cc000e8124fcf46ff2"} Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.993227 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09a6eb79-27c3-465b-adae-b32d96c56b65","Type":"ContainerStarted","Data":"38c673a748b8c890c35cde20c5dedb4dde0c5601fed372bbd77f1da4ff9c4fc4"} Feb 26 11:34:22 crc kubenswrapper[4699]: I0226 11:34:22.993747 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerStarted","Data":"f0fb906b3d6a1aee565ebdc2050e6c35e9bcecf22462ca9f4656f519d7f3b17f"} Feb 26 11:34:23 crc kubenswrapper[4699]: I0226 11:34:23.070274 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-77cbz"] Feb 26 11:34:23 crc kubenswrapper[4699]: W0226 11:34:23.075882 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462a2449_2712_4bb7_9ec9_6e09a1800361.slice/crio-d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb WatchSource:0}: Error finding container d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb: Status 404 returned error can't find the container with id d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.009977 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-77cbz" event={"ID":"462a2449-2712-4bb7-9ec9-6e09a1800361","Type":"ContainerStarted","Data":"c08f0ffa53e77347fd581c677192ce80109e73083d1caad9bb7251a920a34172"} Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.010301 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-77cbz" event={"ID":"462a2449-2712-4bb7-9ec9-6e09a1800361","Type":"ContainerStarted","Data":"d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb"} Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.013642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09a6eb79-27c3-465b-adae-b32d96c56b65","Type":"ContainerStarted","Data":"01eea719091c456f24754ccdf719523f2d585f9f472b3ec03f08f869d78d48a9"} Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.015931 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerStarted","Data":"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02"} Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.015975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerStarted","Data":"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705"} Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.028834 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-77cbz" podStartSLOduration=2.02881789 podStartE2EDuration="2.02881789s" podCreationTimestamp="2026-02-26 11:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:24.024710251 +0000 UTC m=+1409.835536685" watchObservedRunningTime="2026-02-26 11:34:24.02881789 +0000 UTC m=+1409.839644324" Feb 26 11:34:24 crc kubenswrapper[4699]: I0226 11:34:24.054369 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.054344422 podStartE2EDuration="2.054344422s" podCreationTimestamp="2026-02-26 11:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:24.050180921 +0000 UTC m=+1409.861007375" watchObservedRunningTime="2026-02-26 11:34:24.054344422 +0000 UTC m=+1409.865170856" Feb 26 11:34:25 crc kubenswrapper[4699]: I0226 11:34:25.032084 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09a6eb79-27c3-465b-adae-b32d96c56b65","Type":"ContainerStarted","Data":"04e3bf77d1c5035c9875eed43705f82d5f25f919def391b6d1b9f1e0eccc3eed"} Feb 26 11:34:25 crc kubenswrapper[4699]: I0226 11:34:25.354317 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" Feb 26 11:34:25 crc kubenswrapper[4699]: I0226 11:34:25.456155 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:34:25 crc kubenswrapper[4699]: I0226 11:34:25.456964 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="dnsmasq-dns" containerID="cri-o://50c7ddb03e58cd9791ab6f41d1755213bce0ea0826aec0f5b6934548dfaf9782" gracePeriod=10 Feb 26 11:34:25 crc kubenswrapper[4699]: I0226 11:34:25.616221 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: connect: connection refused" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.041309 4699 generic.go:334] "Generic (PLEG): container finished" podID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerID="50c7ddb03e58cd9791ab6f41d1755213bce0ea0826aec0f5b6934548dfaf9782" exitCode=0 Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.041519 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" event={"ID":"90cd25a3-8ac5-49d2-b3a1-79c773a0b394","Type":"ContainerDied","Data":"50c7ddb03e58cd9791ab6f41d1755213bce0ea0826aec0f5b6934548dfaf9782"} Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.041649 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" event={"ID":"90cd25a3-8ac5-49d2-b3a1-79c773a0b394","Type":"ContainerDied","Data":"e1b123469f14c639c8594d09af4903ba398bf0ca95a50aeadc71f0627b95230b"} Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.041670 4699 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e1b123469f14c639c8594d09af4903ba398bf0ca95a50aeadc71f0627b95230b" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.062340 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162230 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162299 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162327 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162388 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gpwp\" (UniqueName: \"kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162524 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" 
(UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.162607 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config\") pod \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\" (UID: \"90cd25a3-8ac5-49d2-b3a1-79c773a0b394\") " Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.184562 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp" (OuterVolumeSpecName: "kube-api-access-6gpwp") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "kube-api-access-6gpwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.208441 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.215655 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.215674 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.220103 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config" (OuterVolumeSpecName: "config") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.224998 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90cd25a3-8ac5-49d2-b3a1-79c773a0b394" (UID: "90cd25a3-8ac5-49d2-b3a1-79c773a0b394"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264628 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264669 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264679 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264689 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264698 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:26 crc kubenswrapper[4699]: I0226 11:34:26.264706 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gpwp\" (UniqueName: \"kubernetes.io/projected/90cd25a3-8ac5-49d2-b3a1-79c773a0b394-kube-api-access-6gpwp\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:27 crc kubenswrapper[4699]: I0226 11:34:27.049532 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-5jmd5" Feb 26 11:34:27 crc kubenswrapper[4699]: I0226 11:34:27.077209 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:34:27 crc kubenswrapper[4699]: I0226 11:34:27.085643 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-5jmd5"] Feb 26 11:34:27 crc kubenswrapper[4699]: I0226 11:34:27.714591 4699 scope.go:117] "RemoveContainer" containerID="1a0ef1ef6d99c76627fc03dba6d4f740ea96e617f11be2b18231f70b40dd8703" Feb 26 11:34:28 crc kubenswrapper[4699]: I0226 11:34:28.272952 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" path="/var/lib/kubelet/pods/90cd25a3-8ac5-49d2-b3a1-79c773a0b394/volumes" Feb 26 11:34:29 crc kubenswrapper[4699]: I0226 11:34:29.070000 4699 generic.go:334] "Generic (PLEG): container finished" podID="462a2449-2712-4bb7-9ec9-6e09a1800361" containerID="c08f0ffa53e77347fd581c677192ce80109e73083d1caad9bb7251a920a34172" exitCode=0 Feb 26 11:34:29 crc kubenswrapper[4699]: I0226 11:34:29.070103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-77cbz" event={"ID":"462a2449-2712-4bb7-9ec9-6e09a1800361","Type":"ContainerDied","Data":"c08f0ffa53e77347fd581c677192ce80109e73083d1caad9bb7251a920a34172"} Feb 26 11:34:29 crc kubenswrapper[4699]: I0226 11:34:29.074855 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09a6eb79-27c3-465b-adae-b32d96c56b65","Type":"ContainerStarted","Data":"43050544752a7e32adeaf5163ad3ea01f011caaf4c7520d8ab03222a32f920f2"} Feb 26 11:34:29 crc kubenswrapper[4699]: I0226 11:34:29.074971 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 11:34:29 crc kubenswrapper[4699]: I0226 11:34:29.126970 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.258535211 podStartE2EDuration="8.126942975s" podCreationTimestamp="2026-02-26 11:34:21 +0000 UTC" firstStartedPulling="2026-02-26 11:34:22.109236094 +0000 UTC m=+1407.920062528" lastFinishedPulling="2026-02-26 11:34:27.977643848 +0000 UTC m=+1413.788470292" observedRunningTime="2026-02-26 11:34:29.110171808 +0000 UTC m=+1414.920998252" watchObservedRunningTime="2026-02-26 11:34:29.126942975 +0000 UTC m=+1414.937769419" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.455792 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.580869 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5hfk\" (UniqueName: \"kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk\") pod \"462a2449-2712-4bb7-9ec9-6e09a1800361\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.581349 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts\") pod \"462a2449-2712-4bb7-9ec9-6e09a1800361\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.581411 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle\") pod \"462a2449-2712-4bb7-9ec9-6e09a1800361\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.581546 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data\") pod 
\"462a2449-2712-4bb7-9ec9-6e09a1800361\" (UID: \"462a2449-2712-4bb7-9ec9-6e09a1800361\") " Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.587995 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk" (OuterVolumeSpecName: "kube-api-access-r5hfk") pod "462a2449-2712-4bb7-9ec9-6e09a1800361" (UID: "462a2449-2712-4bb7-9ec9-6e09a1800361"). InnerVolumeSpecName "kube-api-access-r5hfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.588195 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts" (OuterVolumeSpecName: "scripts") pod "462a2449-2712-4bb7-9ec9-6e09a1800361" (UID: "462a2449-2712-4bb7-9ec9-6e09a1800361"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.612583 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "462a2449-2712-4bb7-9ec9-6e09a1800361" (UID: "462a2449-2712-4bb7-9ec9-6e09a1800361"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.620264 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data" (OuterVolumeSpecName: "config-data") pod "462a2449-2712-4bb7-9ec9-6e09a1800361" (UID: "462a2449-2712-4bb7-9ec9-6e09a1800361"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.683733 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5hfk\" (UniqueName: \"kubernetes.io/projected/462a2449-2712-4bb7-9ec9-6e09a1800361-kube-api-access-r5hfk\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.683771 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.683783 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:30 crc kubenswrapper[4699]: I0226 11:34:30.683795 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462a2449-2712-4bb7-9ec9-6e09a1800361-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.094155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-77cbz" event={"ID":"462a2449-2712-4bb7-9ec9-6e09a1800361","Type":"ContainerDied","Data":"d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb"} Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.094195 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ac1061cf5430e06667789d441516ddac57163a4567ce58f4cc4e16b80bf0eb" Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.094265 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-77cbz" Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.287537 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.287809 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-log" containerID="cri-o://07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" gracePeriod=30 Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.287855 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-api" containerID="cri-o://1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" gracePeriod=30 Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.298272 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.298856 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a8bf50ee-a389-4a35-8899-81d885e1ec38" containerName="nova-scheduler-scheduler" containerID="cri-o://c97d92e712559c7220f14c08b215ac1ea015fa517ad26257d3edff7ee08e2ec0" gracePeriod=30 Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.361051 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.361366 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" containerID="cri-o://7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa" gracePeriod=30 Feb 26 11:34:31 crc kubenswrapper[4699]: I0226 11:34:31.361479 4699 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" containerID="cri-o://36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81" gracePeriod=30 Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.028319 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.114532 4699 generic.go:334] "Generic (PLEG): container finished" podID="c847caf4-446a-4738-88a8-26d1628c91f7" containerID="7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa" exitCode=143 Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.114614 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerDied","Data":"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa"} Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117214 4699 generic.go:334] "Generic (PLEG): container finished" podID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerID="1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" exitCode=0 Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117242 4699 generic.go:334] "Generic (PLEG): container finished" podID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerID="07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" exitCode=143 Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117287 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerDied","Data":"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02"} Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117306 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerDied","Data":"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705"} Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117317 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"325cc77b-b7fa-435b-b6fe-332ee76d0feb","Type":"ContainerDied","Data":"f0fb906b3d6a1aee565ebdc2050e6c35e9bcecf22462ca9f4656f519d7f3b17f"} Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117332 4699 scope.go:117] "RemoveContainer" containerID="1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.117329 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.121202 4699 generic.go:334] "Generic (PLEG): container finished" podID="a8bf50ee-a389-4a35-8899-81d885e1ec38" containerID="c97d92e712559c7220f14c08b215ac1ea015fa517ad26257d3edff7ee08e2ec0" exitCode=0 Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.121491 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8bf50ee-a389-4a35-8899-81d885e1ec38","Type":"ContainerDied","Data":"c97d92e712559c7220f14c08b215ac1ea015fa517ad26257d3edff7ee08e2ec0"} Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.141844 4699 scope.go:117] "RemoveContainer" containerID="07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.171672 4699 scope.go:117] "RemoveContainer" containerID="1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.172335 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02\": container with ID starting with 
1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02 not found: ID does not exist" containerID="1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.172384 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02"} err="failed to get container status \"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02\": rpc error: code = NotFound desc = could not find container \"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02\": container with ID starting with 1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02 not found: ID does not exist" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.172549 4699 scope.go:117] "RemoveContainer" containerID="07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.173154 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705\": container with ID starting with 07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705 not found: ID does not exist" containerID="07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.173232 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705"} err="failed to get container status \"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705\": rpc error: code = NotFound desc = could not find container \"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705\": container with ID starting with 07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705 not found: ID does not 
exist" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.173262 4699 scope.go:117] "RemoveContainer" containerID="1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.174136 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02"} err="failed to get container status \"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02\": rpc error: code = NotFound desc = could not find container \"1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02\": container with ID starting with 1a8f7a599301cb7f75d7e70ef43d139da9f5ff4b21e208c0bbbf4a797499bc02 not found: ID does not exist" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.174185 4699 scope.go:117] "RemoveContainer" containerID="07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.174443 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705"} err="failed to get container status \"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705\": rpc error: code = NotFound desc = could not find container \"07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705\": container with ID starting with 07f6dfef9a97a1b21b7413336f64271a090f8627e3593a5862c177db3ac2f705 not found: ID does not exist" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.207933 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214112 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214236 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214365 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdwcz\" (UniqueName: \"kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214411 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214515 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.214602 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data\") pod \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\" (UID: \"325cc77b-b7fa-435b-b6fe-332ee76d0feb\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.216360 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs" (OuterVolumeSpecName: "logs") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.245147 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz" (OuterVolumeSpecName: "kube-api-access-qdwcz") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "kube-api-access-qdwcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.261909 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.276452 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data" (OuterVolumeSpecName: "config-data") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.286947 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.312089 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "325cc77b-b7fa-435b-b6fe-332ee76d0feb" (UID: "325cc77b-b7fa-435b-b6fe-332ee76d0feb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.315647 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") pod \"a8bf50ee-a389-4a35-8899-81d885e1ec38\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.315912 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data\") pod \"a8bf50ee-a389-4a35-8899-81d885e1ec38\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.316230 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85xhh\" (UniqueName: \"kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh\") pod \"a8bf50ee-a389-4a35-8899-81d885e1ec38\" (UID: 
\"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.316899 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.317001 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.317087 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.317243 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325cc77b-b7fa-435b-b6fe-332ee76d0feb-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.317326 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdwcz\" (UniqueName: \"kubernetes.io/projected/325cc77b-b7fa-435b-b6fe-332ee76d0feb-kube-api-access-qdwcz\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.317400 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325cc77b-b7fa-435b-b6fe-332ee76d0feb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.320174 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh" (OuterVolumeSpecName: "kube-api-access-85xhh") pod "a8bf50ee-a389-4a35-8899-81d885e1ec38" (UID: 
"a8bf50ee-a389-4a35-8899-81d885e1ec38"). InnerVolumeSpecName "kube-api-access-85xhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.343298 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle podName:a8bf50ee-a389-4a35-8899-81d885e1ec38 nodeName:}" failed. No retries permitted until 2026-02-26 11:34:32.84326515 +0000 UTC m=+1418.654091584 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle") pod "a8bf50ee-a389-4a35-8899-81d885e1ec38" (UID: "a8bf50ee-a389-4a35-8899-81d885e1ec38") : error deleting /var/lib/kubelet/pods/a8bf50ee-a389-4a35-8899-81d885e1ec38/volume-subpaths: remove /var/lib/kubelet/pods/a8bf50ee-a389-4a35-8899-81d885e1ec38/volume-subpaths: no such file or directory Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.346017 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data" (OuterVolumeSpecName: "config-data") pod "a8bf50ee-a389-4a35-8899-81d885e1ec38" (UID: "a8bf50ee-a389-4a35-8899-81d885e1ec38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.419352 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85xhh\" (UniqueName: \"kubernetes.io/projected/a8bf50ee-a389-4a35-8899-81d885e1ec38-kube-api-access-85xhh\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.419385 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.500858 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.509845 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.523423 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.523895 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="init" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.523917 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="init" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.523930 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-api" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.523936 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-api" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.523955 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bf50ee-a389-4a35-8899-81d885e1ec38" 
containerName="nova-scheduler-scheduler" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.523962 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bf50ee-a389-4a35-8899-81d885e1ec38" containerName="nova-scheduler-scheduler" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.523981 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-log" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.523987 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-log" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.524002 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462a2449-2712-4bb7-9ec9-6e09a1800361" containerName="nova-manage" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524008 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="462a2449-2712-4bb7-9ec9-6e09a1800361" containerName="nova-manage" Feb 26 11:34:32 crc kubenswrapper[4699]: E0226 11:34:32.524019 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="dnsmasq-dns" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524024 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="dnsmasq-dns" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524200 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-log" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524224 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cd25a3-8ac5-49d2-b3a1-79c773a0b394" containerName="dnsmasq-dns" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524234 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bf50ee-a389-4a35-8899-81d885e1ec38" 
containerName="nova-scheduler-scheduler" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524244 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" containerName="nova-api-api" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.524253 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="462a2449-2712-4bb7-9ec9-6e09a1800361" containerName="nova-manage" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.525913 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.528512 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.528725 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.528772 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.536598 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.622914 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-logs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.623427 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc 
kubenswrapper[4699]: I0226 11:34:32.623506 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.623583 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-config-data\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.623607 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.623675 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx47p\" (UniqueName: \"kubernetes.io/projected/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-kube-api-access-bx47p\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725328 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725369 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725399 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-config-data\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725413 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725442 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx47p\" (UniqueName: \"kubernetes.io/projected/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-kube-api-access-bx47p\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725504 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-logs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.725941 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-logs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 
11:34:32.729143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.729742 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-config-data\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.740762 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.740838 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.743921 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx47p\" (UniqueName: \"kubernetes.io/projected/2d0d807f-7fdc-4239-b7bb-1952c2f7c222-kube-api-access-bx47p\") pod \"nova-api-0\" (UID: \"2d0d807f-7fdc-4239-b7bb-1952c2f7c222\") " pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.842809 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.929194 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") pod \"a8bf50ee-a389-4a35-8899-81d885e1ec38\" (UID: \"a8bf50ee-a389-4a35-8899-81d885e1ec38\") " Feb 26 11:34:32 crc kubenswrapper[4699]: I0226 11:34:32.933561 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8bf50ee-a389-4a35-8899-81d885e1ec38" (UID: "a8bf50ee-a389-4a35-8899-81d885e1ec38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.031453 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bf50ee-a389-4a35-8899-81d885e1ec38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.131083 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8bf50ee-a389-4a35-8899-81d885e1ec38","Type":"ContainerDied","Data":"35561c1ff93cac360e0003512da7f67e357d0a40bd9387c2cdd037287561205d"} Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.131159 4699 scope.go:117] "RemoveContainer" containerID="c97d92e712559c7220f14c08b215ac1ea015fa517ad26257d3edff7ee08e2ec0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.131101 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.192190 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.221480 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.232463 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.233928 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.238243 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.260800 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.348435 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.348786 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.348844 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45jw\" (UniqueName: 
\"kubernetes.io/projected/9d8371db-373f-4a41-97cb-b2d00aa17571-kube-api-access-w45jw\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.381022 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.451273 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.451491 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.451530 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45jw\" (UniqueName: \"kubernetes.io/projected/9d8371db-373f-4a41-97cb-b2d00aa17571-kube-api-access-w45jw\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.456794 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.457010 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9d8371db-373f-4a41-97cb-b2d00aa17571-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.469091 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45jw\" (UniqueName: \"kubernetes.io/projected/9d8371db-373f-4a41-97cb-b2d00aa17571-kube-api-access-w45jw\") pod \"nova-scheduler-0\" (UID: \"9d8371db-373f-4a41-97cb-b2d00aa17571\") " pod="openstack/nova-scheduler-0" Feb 26 11:34:33 crc kubenswrapper[4699]: I0226 11:34:33.579518 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.025525 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.145726 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0d807f-7fdc-4239-b7bb-1952c2f7c222","Type":"ContainerStarted","Data":"7d3165477e1642b932da75d5ca4b8b6972b15beeee88048f905d0dbaba9ac6ea"} Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.145774 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0d807f-7fdc-4239-b7bb-1952c2f7c222","Type":"ContainerStarted","Data":"8027d2ce6c898378c118504a1d7fc78d863f7a78bd83a665b6126a5ea1a0d61a"} Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.145786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0d807f-7fdc-4239-b7bb-1952c2f7c222","Type":"ContainerStarted","Data":"191a5ce01cbb2f558c3373d675ae1bea411b42b7fe62137febb4e27dd54ec69d"} Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.148819 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"9d8371db-373f-4a41-97cb-b2d00aa17571","Type":"ContainerStarted","Data":"094dacce7e358a32d66fa5fcd4112046338df7d27740f54feecf442612c8341d"} Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.184644 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.184619714 podStartE2EDuration="2.184619714s" podCreationTimestamp="2026-02-26 11:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:34.175898361 +0000 UTC m=+1419.986724815" watchObservedRunningTime="2026-02-26 11:34:34.184619714 +0000 UTC m=+1419.995446158" Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.270626 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325cc77b-b7fa-435b-b6fe-332ee76d0feb" path="/var/lib/kubelet/pods/325cc77b-b7fa-435b-b6fe-332ee76d0feb/volumes" Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.271264 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8bf50ee-a389-4a35-8899-81d885e1ec38" path="/var/lib/kubelet/pods/a8bf50ee-a389-4a35-8899-81d885e1ec38/volumes" Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.495071 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:41344->10.217.0.204:8775: read: connection reset by peer" Feb 26 11:34:34 crc kubenswrapper[4699]: I0226 11:34:34.495144 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:41342->10.217.0.204:8775: read: connection reset by peer" Feb 26 11:34:34 crc 
kubenswrapper[4699]: I0226 11:34:34.960551 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.085021 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data\") pod \"c847caf4-446a-4738-88a8-26d1628c91f7\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.085092 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs\") pod \"c847caf4-446a-4738-88a8-26d1628c91f7\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.085110 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj8g6\" (UniqueName: \"kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6\") pod \"c847caf4-446a-4738-88a8-26d1628c91f7\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.085214 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs\") pod \"c847caf4-446a-4738-88a8-26d1628c91f7\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.085265 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle\") pod \"c847caf4-446a-4738-88a8-26d1628c91f7\" (UID: \"c847caf4-446a-4738-88a8-26d1628c91f7\") " Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.086403 4699 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs" (OuterVolumeSpecName: "logs") pod "c847caf4-446a-4738-88a8-26d1628c91f7" (UID: "c847caf4-446a-4738-88a8-26d1628c91f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.102418 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6" (OuterVolumeSpecName: "kube-api-access-rj8g6") pod "c847caf4-446a-4738-88a8-26d1628c91f7" (UID: "c847caf4-446a-4738-88a8-26d1628c91f7"). InnerVolumeSpecName "kube-api-access-rj8g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.120785 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c847caf4-446a-4738-88a8-26d1628c91f7" (UID: "c847caf4-446a-4738-88a8-26d1628c91f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.124700 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data" (OuterVolumeSpecName: "config-data") pod "c847caf4-446a-4738-88a8-26d1628c91f7" (UID: "c847caf4-446a-4738-88a8-26d1628c91f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.174764 4699 generic.go:334] "Generic (PLEG): container finished" podID="c847caf4-446a-4738-88a8-26d1628c91f7" containerID="36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81" exitCode=0 Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.175076 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerDied","Data":"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81"} Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.175103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c847caf4-446a-4738-88a8-26d1628c91f7","Type":"ContainerDied","Data":"3e0522c5f3203ed953a2188504efca913e34205ca0ad66ee7025a214978a16b0"} Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.175137 4699 scope.go:117] "RemoveContainer" containerID="36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.175260 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.181439 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d8371db-373f-4a41-97cb-b2d00aa17571","Type":"ContainerStarted","Data":"c9f59789fb46f140c14a75b396fc6615b2fefe16f21f2f8d660f04f539768d82"} Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.184913 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c847caf4-446a-4738-88a8-26d1628c91f7" (UID: "c847caf4-446a-4738-88a8-26d1628c91f7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.187759 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.187783 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.187813 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847caf4-446a-4738-88a8-26d1628c91f7-logs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.187824 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj8g6\" (UniqueName: \"kubernetes.io/projected/c847caf4-446a-4738-88a8-26d1628c91f7-kube-api-access-rj8g6\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.187833 4699 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c847caf4-446a-4738-88a8-26d1628c91f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.199778 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.199758436 podStartE2EDuration="2.199758436s" podCreationTimestamp="2026-02-26 11:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:35.1994743 +0000 UTC m=+1421.010300734" watchObservedRunningTime="2026-02-26 11:34:35.199758436 +0000 UTC m=+1421.010584870" Feb 26 11:34:35 crc kubenswrapper[4699]: 
I0226 11:34:35.213933 4699 scope.go:117] "RemoveContainer" containerID="7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.236489 4699 scope.go:117] "RemoveContainer" containerID="36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81" Feb 26 11:34:35 crc kubenswrapper[4699]: E0226 11:34:35.237006 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81\": container with ID starting with 36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81 not found: ID does not exist" containerID="36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.237037 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81"} err="failed to get container status \"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81\": rpc error: code = NotFound desc = could not find container \"36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81\": container with ID starting with 36d1cb8833b256faa223eaa6a7298ffa4e21878d531a7626347d9cfebc03ef81 not found: ID does not exist" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.237065 4699 scope.go:117] "RemoveContainer" containerID="7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa" Feb 26 11:34:35 crc kubenswrapper[4699]: E0226 11:34:35.237703 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa\": container with ID starting with 7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa not found: ID does not exist" 
containerID="7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.237761 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa"} err="failed to get container status \"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa\": rpc error: code = NotFound desc = could not find container \"7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa\": container with ID starting with 7421fb2591f05611aff351133a3564dd072d71f5d14ccfd25934a2a6a477aeaa not found: ID does not exist" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.519991 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.541103 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.564208 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:35 crc kubenswrapper[4699]: E0226 11:34:35.564950 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.564976 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" Feb 26 11:34:35 crc kubenswrapper[4699]: E0226 11:34:35.564989 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.564999 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.565317 
4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-log" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.565347 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" containerName="nova-metadata-metadata" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.567020 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.576308 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.581915 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.583609 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.708884 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.708963 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q66wh\" (UniqueName: \"kubernetes.io/projected/15752dfa-4afb-412f-99a0-75c5fe76f6a8-kube-api-access-q66wh\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.709000 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/15752dfa-4afb-412f-99a0-75c5fe76f6a8-logs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.709041 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.709148 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-config-data\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.810414 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15752dfa-4afb-412f-99a0-75c5fe76f6a8-logs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.810499 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.810617 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-config-data\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " 
pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.810666 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.810719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q66wh\" (UniqueName: \"kubernetes.io/projected/15752dfa-4afb-412f-99a0-75c5fe76f6a8-kube-api-access-q66wh\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.811982 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15752dfa-4afb-412f-99a0-75c5fe76f6a8-logs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.816465 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.816494 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.818002 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/15752dfa-4afb-412f-99a0-75c5fe76f6a8-config-data\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.834401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q66wh\" (UniqueName: \"kubernetes.io/projected/15752dfa-4afb-412f-99a0-75c5fe76f6a8-kube-api-access-q66wh\") pod \"nova-metadata-0\" (UID: \"15752dfa-4afb-412f-99a0-75c5fe76f6a8\") " pod="openstack/nova-metadata-0" Feb 26 11:34:35 crc kubenswrapper[4699]: I0226 11:34:35.907626 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 11:34:36 crc kubenswrapper[4699]: I0226 11:34:36.272828 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c847caf4-446a-4738-88a8-26d1628c91f7" path="/var/lib/kubelet/pods/c847caf4-446a-4738-88a8-26d1628c91f7/volumes" Feb 26 11:34:36 crc kubenswrapper[4699]: I0226 11:34:36.424255 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 11:34:37 crc kubenswrapper[4699]: I0226 11:34:37.204806 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15752dfa-4afb-412f-99a0-75c5fe76f6a8","Type":"ContainerStarted","Data":"09054b285959da7487a1e768db02c33e9684f2433939813bfe30bacc02103ce0"} Feb 26 11:34:37 crc kubenswrapper[4699]: I0226 11:34:37.205198 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15752dfa-4afb-412f-99a0-75c5fe76f6a8","Type":"ContainerStarted","Data":"8b667388cc95abb45a06e9af25477109dd583ef5c939684b9cf29a0ee0fc1ffc"} Feb 26 11:34:37 crc kubenswrapper[4699]: I0226 11:34:37.205215 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"15752dfa-4afb-412f-99a0-75c5fe76f6a8","Type":"ContainerStarted","Data":"956e2a368ad68123e08c1b9457041b4b83f645c3c2d4c939e03e4a34a9fc3016"} Feb 26 11:34:37 crc kubenswrapper[4699]: I0226 11:34:37.219829 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.219806155 podStartE2EDuration="2.219806155s" podCreationTimestamp="2026-02-26 11:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:34:37.218377401 +0000 UTC m=+1423.029203855" watchObservedRunningTime="2026-02-26 11:34:37.219806155 +0000 UTC m=+1423.030632599" Feb 26 11:34:38 crc kubenswrapper[4699]: I0226 11:34:38.580085 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 11:34:40 crc kubenswrapper[4699]: I0226 11:34:40.908336 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 11:34:40 crc kubenswrapper[4699]: I0226 11:34:40.908971 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 11:34:42 crc kubenswrapper[4699]: I0226 11:34:42.843065 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 11:34:42 crc kubenswrapper[4699]: I0226 11:34:42.843161 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 11:34:43 crc kubenswrapper[4699]: I0226 11:34:43.580691 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 11:34:43 crc kubenswrapper[4699]: I0226 11:34:43.613464 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 11:34:43 crc kubenswrapper[4699]: I0226 11:34:43.854283 4699 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="2d0d807f-7fdc-4239-b7bb-1952c2f7c222" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 11:34:43 crc kubenswrapper[4699]: I0226 11:34:43.854339 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2d0d807f-7fdc-4239-b7bb-1952c2f7c222" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 11:34:44 crc kubenswrapper[4699]: I0226 11:34:44.303371 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 11:34:45 crc kubenswrapper[4699]: I0226 11:34:45.907983 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 11:34:45 crc kubenswrapper[4699]: I0226 11:34:45.908083 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 11:34:46 crc kubenswrapper[4699]: I0226 11:34:46.922277 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15752dfa-4afb-412f-99a0-75c5fe76f6a8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 11:34:46 crc kubenswrapper[4699]: I0226 11:34:46.922373 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15752dfa-4afb-412f-99a0-75c5fe76f6a8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 11:34:51 crc kubenswrapper[4699]: I0226 11:34:51.495614 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Feb 26 11:34:52 crc kubenswrapper[4699]: I0226 11:34:52.849717 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 11:34:52 crc kubenswrapper[4699]: I0226 11:34:52.850294 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 11:34:52 crc kubenswrapper[4699]: I0226 11:34:52.850412 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 11:34:52 crc kubenswrapper[4699]: I0226 11:34:52.855786 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 11:34:53 crc kubenswrapper[4699]: I0226 11:34:53.383825 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 11:34:53 crc kubenswrapper[4699]: I0226 11:34:53.390021 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 11:34:55 crc kubenswrapper[4699]: I0226 11:34:55.914249 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 11:34:55 crc kubenswrapper[4699]: I0226 11:34:55.916008 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 11:34:55 crc kubenswrapper[4699]: I0226 11:34:55.923498 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 11:34:56 crc kubenswrapper[4699]: I0226 11:34:56.419521 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 11:35:03 crc kubenswrapper[4699]: I0226 11:35:03.966339 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:04 crc kubenswrapper[4699]: I0226 11:35:04.764662 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:08 crc kubenswrapper[4699]: I0226 11:35:08.489885 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="rabbitmq" containerID="cri-o://5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7" gracePeriod=604796 Feb 26 11:35:08 crc kubenswrapper[4699]: I0226 11:35:08.833570 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="rabbitmq" containerID="cri-o://34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625" gracePeriod=604796 Feb 26 11:35:12 crc kubenswrapper[4699]: I0226 11:35:12.336473 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Feb 26 11:35:12 crc kubenswrapper[4699]: I0226 11:35:12.389941 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.217993 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370217 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370349 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370414 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370438 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42595\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370524 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370564 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370591 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370633 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370678 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370734 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.370768 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 
11:35:15.372133 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.372154 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.372387 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.378675 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.381434 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.394071 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595" (OuterVolumeSpecName: "kube-api-access-42595") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "kube-api-access-42595". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.397593 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.397683 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info" (OuterVolumeSpecName: "pod-info") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.422692 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data" (OuterVolumeSpecName: "config-data") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.472408 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf" (OuterVolumeSpecName: "server-conf") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.472555 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") pod \"f4a652b4-5b96-4ebf-81b4-df92846455bd\" (UID: \"f4a652b4-5b96-4ebf-81b4-df92846455bd\") " Feb 26 11:35:15 crc kubenswrapper[4699]: W0226 11:35:15.472906 4699 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f4a652b4-5b96-4ebf-81b4-df92846455bd/volumes/kubernetes.io~configmap/server-conf Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.472924 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf" (OuterVolumeSpecName: "server-conf") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473231 4699 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4a652b4-5b96-4ebf-81b4-df92846455bd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473251 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473273 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473282 4699 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4a652b4-5b96-4ebf-81b4-df92846455bd-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473292 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473303 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42595\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-kube-api-access-42595\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473311 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473319 
4699 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.473327 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.491499 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.549676 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.551485 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f4a652b4-5b96-4ebf-81b4-df92846455bd" (UID: "f4a652b4-5b96-4ebf-81b4-df92846455bd"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.588610 4699 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4a652b4-5b96-4ebf-81b4-df92846455bd-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.588657 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.588671 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4a652b4-5b96-4ebf-81b4-df92846455bd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.670990 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerID="5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7" exitCode=0 Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.671074 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerDied","Data":"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7"} Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.671102 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4a652b4-5b96-4ebf-81b4-df92846455bd","Type":"ContainerDied","Data":"c653f2114aeba63b01bf441458d5ec8f8a6f7c0f66f8ee44c878928901c377ac"} Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.671247 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.671293 4699 scope.go:117] "RemoveContainer" containerID="5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.686976 4699 generic.go:334] "Generic (PLEG): container finished" podID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerID="34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625" exitCode=0 Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.687018 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerDied","Data":"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625"} Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.687043 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d57084d-dc87-44e4-bbc8-50c402b7165b","Type":"ContainerDied","Data":"6c8df8aa27d02e0ceb8002bd8f20b8b521706c7be8fe88b152c705914906b7ae"} Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.687104 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689077 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689139 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689177 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689269 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689313 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689339 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689362 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689422 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689453 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689517 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689565 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz7xp\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp\") pod \"2d57084d-dc87-44e4-bbc8-50c402b7165b\" (UID: \"2d57084d-dc87-44e4-bbc8-50c402b7165b\") " Feb 26 11:35:15 crc kubenswrapper[4699]: 
I0226 11:35:15.689586 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.689904 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.690100 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.690133 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.692882 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.693326 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp" (OuterVolumeSpecName: "kube-api-access-pz7xp") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "kube-api-access-pz7xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.694470 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.698804 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.718283 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.722604 4699 scope.go:117] "RemoveContainer" containerID="01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.733666 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.754102 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data" (OuterVolumeSpecName: "config-data") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.759604 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.782585 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.783047 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783060 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.783080 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="setup-container" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783086 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="setup-container" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.783100 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783106 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.783147 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="setup-container" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783153 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="setup-container" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783332 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.783348 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" containerName="rabbitmq" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.784722 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.789714 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790061 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790233 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790393 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790507 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790557 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.790738 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g9kcp" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.791983 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792009 4699 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792019 4699 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d57084d-dc87-44e4-bbc8-50c402b7165b-pod-info\") on node \"crc\" 
DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792028 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792037 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792046 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz7xp\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-kube-api-access-pz7xp\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792054 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.792063 4699 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d57084d-dc87-44e4-bbc8-50c402b7165b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.801909 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.832584 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.847619 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf" 
(OuterVolumeSpecName: "server-conf") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.861862 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d57084d-dc87-44e4-bbc8-50c402b7165b" (UID: "2d57084d-dc87-44e4-bbc8-50c402b7165b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.863726 4699 scope.go:117] "RemoveContainer" containerID="5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.864238 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7\": container with ID starting with 5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7 not found: ID does not exist" containerID="5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.864323 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7"} err="failed to get container status \"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7\": rpc error: code = NotFound desc = could not find container \"5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7\": container with ID starting with 5338446d199768b870e8acd23a871db21784e94ef89f9e11e4dc4c4cebdfd0f7 not found: ID does not exist" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.864354 4699 scope.go:117] "RemoveContainer" 
containerID="01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.864658 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f\": container with ID starting with 01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f not found: ID does not exist" containerID="01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.864692 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f"} err="failed to get container status \"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f\": rpc error: code = NotFound desc = could not find container \"01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f\": container with ID starting with 01e22f9568e3ffe75e59f3efb64eaf87a7d36c70e421e1930b11972704fa1c7f not found: ID does not exist" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.864715 4699 scope.go:117] "RemoveContainer" containerID="34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894215 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894460 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894529 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d9b2e6e-c43b-49ae-a71e-844610621e3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894707 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5nf4\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-kube-api-access-x5nf4\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894755 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894788 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " 
pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.894989 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895074 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895131 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d9b2e6e-c43b-49ae-a71e-844610621e3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895255 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895432 4699 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d57084d-dc87-44e4-bbc8-50c402b7165b-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895450 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.895460 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d57084d-dc87-44e4-bbc8-50c402b7165b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.937226 4699 scope.go:117] "RemoveContainer" containerID="4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.967790 4699 scope.go:117] "RemoveContainer" containerID="34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 11:35:15.968452 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625\": container with ID starting with 34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625 not found: ID does not exist" containerID="34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.968496 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625"} err="failed to get container status \"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625\": rpc error: code = NotFound desc = could not find container \"34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625\": container with ID starting with 34a964fb2885f46c40a593f7964e9249ef84fe7b3d8685d54bd17236234cc625 not found: ID does not exist" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.968524 4699 scope.go:117] "RemoveContainer" containerID="4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629" Feb 26 11:35:15 crc kubenswrapper[4699]: E0226 
11:35:15.968834 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629\": container with ID starting with 4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629 not found: ID does not exist" containerID="4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.968867 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629"} err="failed to get container status \"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629\": rpc error: code = NotFound desc = could not find container \"4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629\": container with ID starting with 4d1eb8a447a752838461f0d636790cc5233f2885f6f184a34b386abf15a99629 not found: ID does not exist" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.997362 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5nf4\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-kube-api-access-x5nf4\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998024 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998074 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998138 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998174 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d9b2e6e-c43b-49ae-a71e-844610621e3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998289 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998382 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998453 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998499 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.998555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d9b2e6e-c43b-49ae-a71e-844610621e3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.999000 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.999208 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.999533 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.999629 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:15 crc kubenswrapper[4699]: I0226 11:35:15.999795 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.000551 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d9b2e6e-c43b-49ae-a71e-844610621e3e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.002740 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.003062 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d9b2e6e-c43b-49ae-a71e-844610621e3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 
11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.003364 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d9b2e6e-c43b-49ae-a71e-844610621e3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.006011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.023026 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5nf4\" (UniqueName: \"kubernetes.io/projected/0d9b2e6e-c43b-49ae-a71e-844610621e3e-kube-api-access-x5nf4\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.041344 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"0d9b2e6e-c43b-49ae-a71e-844610621e3e\") " pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.047286 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.058676 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.069475 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.071435 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.076813 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.076954 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.077088 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.077155 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.077233 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.077096 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.077359 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mp8r4" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.078466 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202273 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b731314-eb90-4a19-a425-2f9282af2a7f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202325 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202361 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202385 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202464 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b731314-eb90-4a19-a425-2f9282af2a7f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202528 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4f6j\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-kube-api-access-z4f6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202552 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202754 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.202789 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.234802 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.282241 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d57084d-dc87-44e4-bbc8-50c402b7165b" path="/var/lib/kubelet/pods/2d57084d-dc87-44e4-bbc8-50c402b7165b/volumes" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.283195 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a652b4-5b96-4ebf-81b4-df92846455bd" path="/var/lib/kubelet/pods/f4a652b4-5b96-4ebf-81b4-df92846455bd/volumes" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.304282 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.304348 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305181 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305241 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b731314-eb90-4a19-a425-2f9282af2a7f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305583 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305626 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305674 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.305837 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 
11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306057 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306151 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306184 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b731314-eb90-4a19-a425-2f9282af2a7f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306219 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4f6j\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-kube-api-access-z4f6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306279 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306668 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306848 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.306992 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b731314-eb90-4a19-a425-2f9282af2a7f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.308409 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.309460 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b731314-eb90-4a19-a425-2f9282af2a7f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.310047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b731314-eb90-4a19-a425-2f9282af2a7f-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.310053 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.323029 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4f6j\" (UniqueName: \"kubernetes.io/projected/3b731314-eb90-4a19-a425-2f9282af2a7f-kube-api-access-z4f6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.344674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b731314-eb90-4a19-a425-2f9282af2a7f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.396938 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.728170 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.905300 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 11:35:16 crc kubenswrapper[4699]: W0226 11:35:16.908216 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b731314_eb90_4a19_a425_2f9282af2a7f.slice/crio-a46589663d794710d4c07fab7c01ba86c58a47a81946b5e4edd54fdd8b063a57 WatchSource:0}: Error finding container a46589663d794710d4c07fab7c01ba86c58a47a81946b5e4edd54fdd8b063a57: Status 404 returned error can't find the container with id a46589663d794710d4c07fab7c01ba86c58a47a81946b5e4edd54fdd8b063a57 Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.974907 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-5898z"] Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.976790 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:16 crc kubenswrapper[4699]: I0226 11:35:16.978881 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.029826 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-5898z"] Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127319 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127349 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127380 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchk9\" (UniqueName: \"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " 
pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127480 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.127560 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.173795 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-5898z"] Feb 26 11:35:17 crc kubenswrapper[4699]: E0226 11:35:17.174545 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-tchk9 openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5576978c7c-5898z" podUID="390537ad-fb8f-417c-9577-c6958c371659" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.240658 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-8c6f6df99-hddfn"] Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.242313 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244060 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244106 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244164 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244224 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244250 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244272 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.244293 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchk9\" (UniqueName: \"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.245446 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.245917 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.246366 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc\") pod 
\"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.246717 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.247108 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.247214 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.280356 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-hddfn"] Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.290182 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchk9\" (UniqueName: \"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9\") pod \"dnsmasq-dns-5576978c7c-5898z\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346171 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346322 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-config\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346369 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346384 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.346419 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vtm4\" (UniqueName: \"kubernetes.io/projected/24dd88a8-4737-4ebc-8925-b2bcedb760c2-kube-api-access-7vtm4\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.447789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.447846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.447906 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vtm4\" (UniqueName: \"kubernetes.io/projected/24dd88a8-4737-4ebc-8925-b2bcedb760c2-kube-api-access-7vtm4\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448015 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448062 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448166 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448198 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-config\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448876 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.448930 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.449005 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.449098 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.449224 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.449431 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dd88a8-4737-4ebc-8925-b2bcedb760c2-config\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.528239 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vtm4\" (UniqueName: 
\"kubernetes.io/projected/24dd88a8-4737-4ebc-8925-b2bcedb760c2-kube-api-access-7vtm4\") pod \"dnsmasq-dns-8c6f6df99-hddfn\" (UID: \"24dd88a8-4737-4ebc-8925-b2bcedb760c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.591585 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.708708 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0d9b2e6e-c43b-49ae-a71e-844610621e3e","Type":"ContainerStarted","Data":"5aad5beafa395051116bd03caf5da12524f4ae6cf970c26fa65a71dc636e2c06"} Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.710197 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b731314-eb90-4a19-a425-2f9282af2a7f","Type":"ContainerStarted","Data":"a46589663d794710d4c07fab7c01ba86c58a47a81946b5e4edd54fdd8b063a57"} Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.710217 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.812092 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.957573 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958052 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958134 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchk9\" (UniqueName: \"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958262 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958271 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958337 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958588 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958825 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.958895 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb\") pod \"390537ad-fb8f-417c-9577-c6958c371659\" (UID: \"390537ad-fb8f-417c-9577-c6958c371659\") " Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959034 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959450 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959514 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959523 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959532 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959563 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:17 crc kubenswrapper[4699]: I0226 11:35:17.959593 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config" (OuterVolumeSpecName: "config") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.061897 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.061964 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.061986 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/390537ad-fb8f-417c-9577-c6958c371659-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.126238 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9" (OuterVolumeSpecName: "kube-api-access-tchk9") pod "390537ad-fb8f-417c-9577-c6958c371659" (UID: "390537ad-fb8f-417c-9577-c6958c371659"). InnerVolumeSpecName "kube-api-access-tchk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:18 crc kubenswrapper[4699]: W0226 11:35:18.162924 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dd88a8_4737_4ebc_8925_b2bcedb760c2.slice/crio-61314fbd31e9e12163888c8229d0a18a04f98696e27f58cce31b7c12f38dec1f WatchSource:0}: Error finding container 61314fbd31e9e12163888c8229d0a18a04f98696e27f58cce31b7c12f38dec1f: Status 404 returned error can't find the container with id 61314fbd31e9e12163888c8229d0a18a04f98696e27f58cce31b7c12f38dec1f Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.163226 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchk9\" (UniqueName: \"kubernetes.io/projected/390537ad-fb8f-417c-9577-c6958c371659-kube-api-access-tchk9\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.163783 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-hddfn"] Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.788016 4699 generic.go:334] "Generic (PLEG): container finished" podID="24dd88a8-4737-4ebc-8925-b2bcedb760c2" containerID="957c1998dc4811d9f911eb451c4fa82b1cc78906876fe523cf44f5d6bff01ae4" exitCode=0 Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.788169 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" event={"ID":"24dd88a8-4737-4ebc-8925-b2bcedb760c2","Type":"ContainerDied","Data":"957c1998dc4811d9f911eb451c4fa82b1cc78906876fe523cf44f5d6bff01ae4"} Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.789357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" event={"ID":"24dd88a8-4737-4ebc-8925-b2bcedb760c2","Type":"ContainerStarted","Data":"61314fbd31e9e12163888c8229d0a18a04f98696e27f58cce31b7c12f38dec1f"} Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.793819 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0d9b2e6e-c43b-49ae-a71e-844610621e3e","Type":"ContainerStarted","Data":"55f4011887d6914b7d8dfc8eb0b5e6a2ccfc779f66663a9834966baecf2a10a6"} Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.795906 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-5898z" Feb 26 11:35:18 crc kubenswrapper[4699]: I0226 11:35:18.796615 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b731314-eb90-4a19-a425-2f9282af2a7f","Type":"ContainerStarted","Data":"f1100fd18904af6344b106082dc21d94e757513c180993bbeb69617d1198ee7b"} Feb 26 11:35:19 crc kubenswrapper[4699]: I0226 11:35:19.038602 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-5898z"] Feb 26 11:35:19 crc kubenswrapper[4699]: I0226 11:35:19.047874 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-5898z"] Feb 26 11:35:19 crc kubenswrapper[4699]: I0226 11:35:19.808833 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" event={"ID":"24dd88a8-4737-4ebc-8925-b2bcedb760c2","Type":"ContainerStarted","Data":"1e294159b0f9ba0aebb703b10b2bb5ec590973057e4b46964cbb2bd082081378"} Feb 26 11:35:19 crc kubenswrapper[4699]: I0226 11:35:19.809209 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:19 crc kubenswrapper[4699]: I0226 11:35:19.835355 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" 
podStartSLOduration=2.835334031 podStartE2EDuration="2.835334031s" podCreationTimestamp="2026-02-26 11:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:35:19.824409485 +0000 UTC m=+1465.635235919" watchObservedRunningTime="2026-02-26 11:35:19.835334031 +0000 UTC m=+1465.646160465" Feb 26 11:35:20 crc kubenswrapper[4699]: I0226 11:35:20.272445 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390537ad-fb8f-417c-9577-c6958c371659" path="/var/lib/kubelet/pods/390537ad-fb8f-417c-9577-c6958c371659/volumes" Feb 26 11:35:27 crc kubenswrapper[4699]: I0226 11:35:27.593933 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-hddfn" Feb 26 11:35:27 crc kubenswrapper[4699]: I0226 11:35:27.653356 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"] Feb 26 11:35:27 crc kubenswrapper[4699]: I0226 11:35:27.653673 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="dnsmasq-dns" containerID="cri-o://6eaae1ee8cf33fbf9c5f7338398f314d84ab95982df8c9ecfdd230c190623ca8" gracePeriod=10 Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.001739 4699 generic.go:334] "Generic (PLEG): container finished" podID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerID="6eaae1ee8cf33fbf9c5f7338398f314d84ab95982df8c9ecfdd230c190623ca8" exitCode=0 Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.002062 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" event={"ID":"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8","Type":"ContainerDied","Data":"6eaae1ee8cf33fbf9c5f7338398f314d84ab95982df8c9ecfdd230c190623ca8"} Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.124483 4699 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198187 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198283 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198332 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr9gh\" (UniqueName: \"kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198422 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198450 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.198544 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config\") pod \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\" (UID: \"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8\") " Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.203452 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh" (OuterVolumeSpecName: "kube-api-access-mr9gh") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "kube-api-access-mr9gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.247317 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.248384 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.251523 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.258452 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config" (OuterVolumeSpecName: "config") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.267217 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" (UID: "a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.304385 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.304422 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.304433 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-config\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.304443 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:28 
crc kubenswrapper[4699]: I0226 11:35:28.304452 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:28 crc kubenswrapper[4699]: I0226 11:35:28.304461 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr9gh\" (UniqueName: \"kubernetes.io/projected/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8-kube-api-access-mr9gh\") on node \"crc\" DevicePath \"\"" Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.012008 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" event={"ID":"a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8","Type":"ContainerDied","Data":"d1d082240eaff72440b2e6ab6682cc7abdf39c898255b3c76048247bf61866be"} Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.012177 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-n24ct" Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.012423 4699 scope.go:117] "RemoveContainer" containerID="6eaae1ee8cf33fbf9c5f7338398f314d84ab95982df8c9ecfdd230c190623ca8" Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.037765 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"] Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.038640 4699 scope.go:117] "RemoveContainer" containerID="ac279ccad47adb2f6ab2c9bfda803625849869922644f32045786543361b143f" Feb 26 11:35:29 crc kubenswrapper[4699]: I0226 11:35:29.046891 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-n24ct"] Feb 26 11:35:30 crc kubenswrapper[4699]: I0226 11:35:30.272564 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" path="/var/lib/kubelet/pods/a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8/volumes" Feb 26 11:35:36 crc 
kubenswrapper[4699]: I0226 11:35:36.203982 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n"] Feb 26 11:35:36 crc kubenswrapper[4699]: E0226 11:35:36.205352 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="init" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.205374 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="init" Feb 26 11:35:36 crc kubenswrapper[4699]: E0226 11:35:36.205410 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="dnsmasq-dns" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.205420 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="dnsmasq-dns" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.205685 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e9b0e9-b01f-4c92-aa43-f31c9fd397f8" containerName="dnsmasq-dns" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.206714 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.210773 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.210812 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.211054 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.214959 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n"] Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.217633 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.338271 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.338331 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc 
kubenswrapper[4699]: I0226 11:35:36.338607 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfm59\" (UniqueName: \"kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.338813 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.441188 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.441384 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.441428 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.441572 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfm59\" (UniqueName: \"kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.448899 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.449140 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.453726 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.459671 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfm59\" (UniqueName: \"kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:36 crc kubenswrapper[4699]: I0226 11:35:36.530244 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" Feb 26 11:35:37 crc kubenswrapper[4699]: W0226 11:35:37.047912 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57bbec48_f33e_43b8_9f82_8cc3a42e7723.slice/crio-af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e WatchSource:0}: Error finding container af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e: Status 404 returned error can't find the container with id af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e Feb 26 11:35:37 crc kubenswrapper[4699]: I0226 11:35:37.050373 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n"] Feb 26 11:35:37 crc kubenswrapper[4699]: I0226 11:35:37.050984 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:35:37 crc kubenswrapper[4699]: I0226 11:35:37.317373 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" event={"ID":"57bbec48-f33e-43b8-9f82-8cc3a42e7723","Type":"ContainerStarted","Data":"af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e"} Feb 26 11:35:51 crc kubenswrapper[4699]: I0226 
11:35:51.408787 4699 generic.go:334] "Generic (PLEG): container finished" podID="0d9b2e6e-c43b-49ae-a71e-844610621e3e" containerID="55f4011887d6914b7d8dfc8eb0b5e6a2ccfc779f66663a9834966baecf2a10a6" exitCode=0 Feb 26 11:35:51 crc kubenswrapper[4699]: I0226 11:35:51.408873 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0d9b2e6e-c43b-49ae-a71e-844610621e3e","Type":"ContainerDied","Data":"55f4011887d6914b7d8dfc8eb0b5e6a2ccfc779f66663a9834966baecf2a10a6"} Feb 26 11:35:51 crc kubenswrapper[4699]: I0226 11:35:51.412140 4699 generic.go:334] "Generic (PLEG): container finished" podID="3b731314-eb90-4a19-a425-2f9282af2a7f" containerID="f1100fd18904af6344b106082dc21d94e757513c180993bbeb69617d1198ee7b" exitCode=0 Feb 26 11:35:51 crc kubenswrapper[4699]: I0226 11:35:51.412180 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b731314-eb90-4a19-a425-2f9282af2a7f","Type":"ContainerDied","Data":"f1100fd18904af6344b106082dc21d94e757513c180993bbeb69617d1198ee7b"} Feb 26 11:35:52 crc kubenswrapper[4699]: E0226 11:35:52.320409 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Feb 26 11:35:52 crc kubenswrapper[4699]: E0226 11:35:52.321100 4699 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 11:35:52 crc kubenswrapper[4699]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Feb 26 11:35:52 crc kubenswrapper[4699]: - hosts: all Feb 26 11:35:52 crc kubenswrapper[4699]: strategy: 
linear Feb 26 11:35:52 crc kubenswrapper[4699]: tasks: Feb 26 11:35:52 crc kubenswrapper[4699]: - name: Enable podified-repos Feb 26 11:35:52 crc kubenswrapper[4699]: become: true Feb 26 11:35:52 crc kubenswrapper[4699]: ansible.builtin.shell: | Feb 26 11:35:52 crc kubenswrapper[4699]: set -euxo pipefail Feb 26 11:35:52 crc kubenswrapper[4699]: pushd /var/tmp Feb 26 11:35:52 crc kubenswrapper[4699]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Feb 26 11:35:52 crc kubenswrapper[4699]: pushd repo-setup-main Feb 26 11:35:52 crc kubenswrapper[4699]: python3 -m venv ./venv Feb 26 11:35:52 crc kubenswrapper[4699]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Feb 26 11:35:52 crc kubenswrapper[4699]: ./venv/bin/repo-setup current-podified -b antelope Feb 26 11:35:52 crc kubenswrapper[4699]: popd Feb 26 11:35:52 crc kubenswrapper[4699]: rm -rf repo-setup-main Feb 26 11:35:52 crc kubenswrapper[4699]: Feb 26 11:35:52 crc kubenswrapper[4699]: Feb 26 11:35:52 crc kubenswrapper[4699]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Feb 26 11:35:52 crc kubenswrapper[4699]: edpm_override_hosts: openstack-edpm-ipam Feb 26 11:35:52 crc kubenswrapper[4699]: edpm_service_type: repo-setup Feb 26 11:35:52 crc kubenswrapper[4699]: Feb 26 11:35:52 crc kubenswrapper[4699]: Feb 26 11:35:52 crc kubenswrapper[4699]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kfm59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n_openstack(57bbec48-f33e-43b8-9f82-8cc3a42e7723): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Feb 26 11:35:52 crc kubenswrapper[4699]: > logger="UnhandledError" Feb 26 11:35:52 crc kubenswrapper[4699]: E0226 11:35:52.322366 4699 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" podUID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" Feb 26 11:35:52 crc kubenswrapper[4699]: E0226 11:35:52.421497 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" podUID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.436469 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0d9b2e6e-c43b-49ae-a71e-844610621e3e","Type":"ContainerStarted","Data":"081ba4071f2b3c84f8726ec4603efe137af81b7020ff19b2eeacb43719124818"} Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.437023 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.441225 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b731314-eb90-4a19-a425-2f9282af2a7f","Type":"ContainerStarted","Data":"6e7541daf389c5f883354e578c9ebbd4867cbce55b6bc94a679cc8a43069d874"} Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.442077 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.472518 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.472489024 podStartE2EDuration="38.472489024s" 
podCreationTimestamp="2026-02-26 11:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:35:53.458911787 +0000 UTC m=+1499.269738231" watchObservedRunningTime="2026-02-26 11:35:53.472489024 +0000 UTC m=+1499.283315458" Feb 26 11:35:53 crc kubenswrapper[4699]: I0226 11:35:53.498426 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.49840161 podStartE2EDuration="37.49840161s" podCreationTimestamp="2026-02-26 11:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 11:35:53.487997507 +0000 UTC m=+1499.298823961" watchObservedRunningTime="2026-02-26 11:35:53.49840161 +0000 UTC m=+1499.309228044" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.007934 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"] Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.010837 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.019038 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"] Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.142011 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdc5p\" (UniqueName: \"kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.142075 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.142327 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.244712 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.244846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fdc5p\" (UniqueName: \"kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.244890 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.245407 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.245448 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.267396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdc5p\" (UniqueName: \"kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p\") pod \"redhat-operators-b92vt\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") " pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:55 crc kubenswrapper[4699]: I0226 11:35:55.332194 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b92vt" Feb 26 11:35:56 crc kubenswrapper[4699]: I0226 11:35:55.999656 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"] Feb 26 11:35:57 crc kubenswrapper[4699]: I0226 11:35:57.074023 4699 generic.go:334] "Generic (PLEG): container finished" podID="095e0632-b9cc-4410-af45-249da70797aa" containerID="a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3" exitCode=0 Feb 26 11:35:57 crc kubenswrapper[4699]: I0226 11:35:57.074406 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerDied","Data":"a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3"} Feb 26 11:35:57 crc kubenswrapper[4699]: I0226 11:35:57.074441 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerStarted","Data":"d12ea0f251bc41e4b956605602d54f047da25af921010667a43f8d590bf06d61"} Feb 26 11:35:59 crc kubenswrapper[4699]: I0226 11:35:59.151133 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerStarted","Data":"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"} Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.157408 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535096-xr7rk"] Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.159029 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.168635 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.169065 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.169293 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.180348 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535096-xr7rk"]
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.288455 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzw9d\" (UniqueName: \"kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d\") pod \"auto-csr-approver-29535096-xr7rk\" (UID: \"6b65e61c-3853-4fd6-93c2-9d13c6776589\") " pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.390690 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzw9d\" (UniqueName: \"kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d\") pod \"auto-csr-approver-29535096-xr7rk\" (UID: \"6b65e61c-3853-4fd6-93c2-9d13c6776589\") " pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:00 crc kubenswrapper[4699]: I0226 11:36:00.875743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzw9d\" (UniqueName: \"kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d\") pod \"auto-csr-approver-29535096-xr7rk\" (UID: \"6b65e61c-3853-4fd6-93c2-9d13c6776589\") " pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:01 crc kubenswrapper[4699]: I0226 11:36:01.090731 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:01 crc kubenswrapper[4699]: W0226 11:36:01.567579 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b65e61c_3853_4fd6_93c2_9d13c6776589.slice/crio-e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced WatchSource:0}: Error finding container e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced: Status 404 returned error can't find the container with id e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced
Feb 26 11:36:01 crc kubenswrapper[4699]: I0226 11:36:01.584806 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535096-xr7rk"]
Feb 26 11:36:02 crc kubenswrapper[4699]: I0226 11:36:02.217802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535096-xr7rk" event={"ID":"6b65e61c-3853-4fd6-93c2-9d13c6776589","Type":"ContainerStarted","Data":"e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced"}
Feb 26 11:36:04 crc kubenswrapper[4699]: I0226 11:36:04.252771 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b65e61c-3853-4fd6-93c2-9d13c6776589" containerID="dd9ce01dbb3d28e8559eda1261c169a7dbac7ba191f3aabd0c7a5d33511f3c12" exitCode=0
Feb 26 11:36:04 crc kubenswrapper[4699]: I0226 11:36:04.252866 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535096-xr7rk" event={"ID":"6b65e61c-3853-4fd6-93c2-9d13c6776589","Type":"ContainerDied","Data":"dd9ce01dbb3d28e8559eda1261c169a7dbac7ba191f3aabd0c7a5d33511f3c12"}
Feb 26 11:36:05 crc kubenswrapper[4699]: I0226 11:36:05.597054 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:05 crc kubenswrapper[4699]: I0226 11:36:05.763724 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzw9d\" (UniqueName: \"kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d\") pod \"6b65e61c-3853-4fd6-93c2-9d13c6776589\" (UID: \"6b65e61c-3853-4fd6-93c2-9d13c6776589\") "
Feb 26 11:36:05 crc kubenswrapper[4699]: I0226 11:36:05.771743 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d" (OuterVolumeSpecName: "kube-api-access-lzw9d") pod "6b65e61c-3853-4fd6-93c2-9d13c6776589" (UID: "6b65e61c-3853-4fd6-93c2-9d13c6776589"). InnerVolumeSpecName "kube-api-access-lzw9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:36:05 crc kubenswrapper[4699]: I0226 11:36:05.866593 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzw9d\" (UniqueName: \"kubernetes.io/projected/6b65e61c-3853-4fd6-93c2-9d13c6776589-kube-api-access-lzw9d\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.240568 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.287642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535096-xr7rk" event={"ID":"6b65e61c-3853-4fd6-93c2-9d13c6776589","Type":"ContainerDied","Data":"e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced"}
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.287949 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79e93ab54a6caa74fe83a7d4937a5bd6311e550b7ba931ab206e7d1f93c0ced"
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.288202 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535096-xr7rk"
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.401385 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.689629 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535090-7v44h"]
Feb 26 11:36:06 crc kubenswrapper[4699]: I0226 11:36:06.710510 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535090-7v44h"]
Feb 26 11:36:08 crc kubenswrapper[4699]: I0226 11:36:08.277912 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d38a99-b56f-423c-9c5b-c8f726bf62f9" path="/var/lib/kubelet/pods/a0d38a99-b56f-423c-9c5b-c8f726bf62f9/volumes"
Feb 26 11:36:15 crc kubenswrapper[4699]: I0226 11:36:15.548958 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:36:16 crc kubenswrapper[4699]: I0226 11:36:16.407290 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" event={"ID":"57bbec48-f33e-43b8-9f82-8cc3a42e7723","Type":"ContainerStarted","Data":"d3b1a1a717449801469d3bbcb93483dc2d3c83e649043f7dd4668fd3aea9c6fd"}
Feb 26 11:36:17 crc kubenswrapper[4699]: I0226 11:36:17.442151 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" podStartSLOduration=2.947932414 podStartE2EDuration="41.442078763s" podCreationTimestamp="2026-02-26 11:35:36 +0000 UTC" firstStartedPulling="2026-02-26 11:35:37.050691479 +0000 UTC m=+1482.861517923" lastFinishedPulling="2026-02-26 11:36:15.544837838 +0000 UTC m=+1521.355664272" observedRunningTime="2026-02-26 11:36:17.436324757 +0000 UTC m=+1523.247151221" watchObservedRunningTime="2026-02-26 11:36:17.442078763 +0000 UTC m=+1523.252905197"
Feb 26 11:36:19 crc kubenswrapper[4699]: I0226 11:36:19.437083 4699 generic.go:334] "Generic (PLEG): container finished" podID="095e0632-b9cc-4410-af45-249da70797aa" containerID="b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6" exitCode=0
Feb 26 11:36:19 crc kubenswrapper[4699]: I0226 11:36:19.437155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerDied","Data":"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"}
Feb 26 11:36:20 crc kubenswrapper[4699]: I0226 11:36:20.449542 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerStarted","Data":"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"}
Feb 26 11:36:20 crc kubenswrapper[4699]: I0226 11:36:20.479273 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b92vt" podStartSLOduration=3.58520726 podStartE2EDuration="26.479245947s" podCreationTimestamp="2026-02-26 11:35:54 +0000 UTC" firstStartedPulling="2026-02-26 11:35:57.07659021 +0000 UTC m=+1502.887416644" lastFinishedPulling="2026-02-26 11:36:19.970628897 +0000 UTC m=+1525.781455331" observedRunningTime="2026-02-26 11:36:20.469461293 +0000 UTC m=+1526.280287737" watchObservedRunningTime="2026-02-26 11:36:20.479245947 +0000 UTC m=+1526.290072391"
Feb 26 11:36:25 crc kubenswrapper[4699]: I0226 11:36:25.332978 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:25 crc kubenswrapper[4699]: I0226 11:36:25.333495 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:26 crc kubenswrapper[4699]: I0226 11:36:26.378748 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b92vt" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="registry-server" probeResult="failure" output=<
Feb 26 11:36:26 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s
Feb 26 11:36:26 crc kubenswrapper[4699]: >
Feb 26 11:36:28 crc kubenswrapper[4699]: I0226 11:36:28.139528 4699 scope.go:117] "RemoveContainer" containerID="80050d8650124cdda213563d70066e26f43de8d356825ac23d9b4fdfcc1d3b22"
Feb 26 11:36:28 crc kubenswrapper[4699]: I0226 11:36:28.168147 4699 scope.go:117] "RemoveContainer" containerID="02c1126ec0d166bfd6091e444f16da2788ee1d75f58864b8bc99a6f2547f9104"
Feb 26 11:36:30 crc kubenswrapper[4699]: I0226 11:36:30.550724 4699 generic.go:334] "Generic (PLEG): container finished" podID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" containerID="d3b1a1a717449801469d3bbcb93483dc2d3c83e649043f7dd4668fd3aea9c6fd" exitCode=0
Feb 26 11:36:30 crc kubenswrapper[4699]: I0226 11:36:30.550848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" event={"ID":"57bbec48-f33e-43b8-9f82-8cc3a42e7723","Type":"ContainerDied","Data":"d3b1a1a717449801469d3bbcb93483dc2d3c83e649043f7dd4668fd3aea9c6fd"}
Feb 26 11:36:31 crc kubenswrapper[4699]: I0226 11:36:31.946368 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.002774 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory\") pod \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") "
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.003001 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfm59\" (UniqueName: \"kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59\") pod \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") "
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.003030 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle\") pod \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") "
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.003165 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam\") pod \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\" (UID: \"57bbec48-f33e-43b8-9f82-8cc3a42e7723\") "
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.008737 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "57bbec48-f33e-43b8-9f82-8cc3a42e7723" (UID: "57bbec48-f33e-43b8-9f82-8cc3a42e7723"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.008964 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59" (OuterVolumeSpecName: "kube-api-access-kfm59") pod "57bbec48-f33e-43b8-9f82-8cc3a42e7723" (UID: "57bbec48-f33e-43b8-9f82-8cc3a42e7723"). InnerVolumeSpecName "kube-api-access-kfm59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.035569 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory" (OuterVolumeSpecName: "inventory") pod "57bbec48-f33e-43b8-9f82-8cc3a42e7723" (UID: "57bbec48-f33e-43b8-9f82-8cc3a42e7723"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.037906 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57bbec48-f33e-43b8-9f82-8cc3a42e7723" (UID: "57bbec48-f33e-43b8-9f82-8cc3a42e7723"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.105042 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.105075 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.105084 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfm59\" (UniqueName: \"kubernetes.io/projected/57bbec48-f33e-43b8-9f82-8cc3a42e7723-kube-api-access-kfm59\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.105092 4699 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bbec48-f33e-43b8-9f82-8cc3a42e7723-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.571087 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n" event={"ID":"57bbec48-f33e-43b8-9f82-8cc3a42e7723","Type":"ContainerDied","Data":"af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e"}
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.571400 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af6016a31142a78e40da5360c3d498d8cf13c0803705344a16985003cec0582e"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.571183 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.655274 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"]
Feb 26 11:36:32 crc kubenswrapper[4699]: E0226 11:36:32.655807 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.655835 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:36:32 crc kubenswrapper[4699]: E0226 11:36:32.655854 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b65e61c-3853-4fd6-93c2-9d13c6776589" containerName="oc"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.655863 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b65e61c-3853-4fd6-93c2-9d13c6776589" containerName="oc"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.656140 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="57bbec48-f33e-43b8-9f82-8cc3a42e7723" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.656166 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b65e61c-3853-4fd6-93c2-9d13c6776589" containerName="oc"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.656913 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.658894 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.659991 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.660151 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.662269 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.664760 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"]
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.716096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.716338 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.716370 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnnk\" (UniqueName: \"kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.818258 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.818383 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.818413 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnnk\" (UniqueName: \"kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.822630 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.822676 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.835572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnnk\" (UniqueName: \"kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zdf2z\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:32 crc kubenswrapper[4699]: I0226 11:36:32.971656 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:33 crc kubenswrapper[4699]: I0226 11:36:33.506338 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"]
Feb 26 11:36:33 crc kubenswrapper[4699]: I0226 11:36:33.581227 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z" event={"ID":"fcea0fcf-0c80-4334-9327-f0a57b385cc9","Type":"ContainerStarted","Data":"eb9cf19f3dcfcec0226f2b1b4e3eeb146f04cb508c317c36f5c63ab6d203f2d3"}
Feb 26 11:36:34 crc kubenswrapper[4699]: I0226 11:36:34.591470 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z" event={"ID":"fcea0fcf-0c80-4334-9327-f0a57b385cc9","Type":"ContainerStarted","Data":"d1583c97a6f8ed5901159ae8fbbdacf36c4ff0c48237ee8801ed6a7b20f80324"}
Feb 26 11:36:34 crc kubenswrapper[4699]: I0226 11:36:34.613588 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z" podStartSLOduration=2.164491996 podStartE2EDuration="2.613563851s" podCreationTimestamp="2026-02-26 11:36:32 +0000 UTC" firstStartedPulling="2026-02-26 11:36:33.51113727 +0000 UTC m=+1539.321963714" lastFinishedPulling="2026-02-26 11:36:33.960209125 +0000 UTC m=+1539.771035569" observedRunningTime="2026-02-26 11:36:34.604721044 +0000 UTC m=+1540.415547478" watchObservedRunningTime="2026-02-26 11:36:34.613563851 +0000 UTC m=+1540.424390285"
Feb 26 11:36:35 crc kubenswrapper[4699]: I0226 11:36:35.381526 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:35 crc kubenswrapper[4699]: I0226 11:36:35.432792 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:35 crc kubenswrapper[4699]: I0226 11:36:35.652850 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"]
Feb 26 11:36:36 crc kubenswrapper[4699]: I0226 11:36:36.609203 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b92vt" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="registry-server" containerID="cri-o://12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99" gracePeriod=2
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.129057 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.200717 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content\") pod \"095e0632-b9cc-4410-af45-249da70797aa\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") "
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.200795 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities\") pod \"095e0632-b9cc-4410-af45-249da70797aa\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") "
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.200864 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdc5p\" (UniqueName: \"kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p\") pod \"095e0632-b9cc-4410-af45-249da70797aa\" (UID: \"095e0632-b9cc-4410-af45-249da70797aa\") "
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.202075 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities" (OuterVolumeSpecName: "utilities") pod "095e0632-b9cc-4410-af45-249da70797aa" (UID: "095e0632-b9cc-4410-af45-249da70797aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.207241 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p" (OuterVolumeSpecName: "kube-api-access-fdc5p") pod "095e0632-b9cc-4410-af45-249da70797aa" (UID: "095e0632-b9cc-4410-af45-249da70797aa"). InnerVolumeSpecName "kube-api-access-fdc5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.303377 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.303413 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdc5p\" (UniqueName: \"kubernetes.io/projected/095e0632-b9cc-4410-af45-249da70797aa-kube-api-access-fdc5p\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.331289 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "095e0632-b9cc-4410-af45-249da70797aa" (UID: "095e0632-b9cc-4410-af45-249da70797aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.405574 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095e0632-b9cc-4410-af45-249da70797aa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.624423 4699 generic.go:334] "Generic (PLEG): container finished" podID="fcea0fcf-0c80-4334-9327-f0a57b385cc9" containerID="d1583c97a6f8ed5901159ae8fbbdacf36c4ff0c48237ee8801ed6a7b20f80324" exitCode=0
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.624481 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z" event={"ID":"fcea0fcf-0c80-4334-9327-f0a57b385cc9","Type":"ContainerDied","Data":"d1583c97a6f8ed5901159ae8fbbdacf36c4ff0c48237ee8801ed6a7b20f80324"}
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.628281 4699 generic.go:334] "Generic (PLEG): container finished" podID="095e0632-b9cc-4410-af45-249da70797aa" containerID="12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99" exitCode=0
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.628346 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerDied","Data":"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"}
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.628369 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b92vt"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.628393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b92vt" event={"ID":"095e0632-b9cc-4410-af45-249da70797aa","Type":"ContainerDied","Data":"d12ea0f251bc41e4b956605602d54f047da25af921010667a43f8d590bf06d61"}
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.628420 4699 scope.go:117] "RemoveContainer" containerID="12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.661049 4699 scope.go:117] "RemoveContainer" containerID="b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.665633 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"]
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.681999 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b92vt"]
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.689317 4699 scope.go:117] "RemoveContainer" containerID="a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.726368 4699 scope.go:117] "RemoveContainer" containerID="12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"
Feb 26 11:36:37 crc kubenswrapper[4699]: E0226 11:36:37.726940 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99\": container with ID starting with 12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99 not found: ID does not exist" containerID="12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.726983 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99"} err="failed to get container status \"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99\": rpc error: code = NotFound desc = could not find container \"12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99\": container with ID starting with 12ebf9a305b3675fffc94fa24d3e61874d69efaed6066d44344171bd74ba7c99 not found: ID does not exist"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.727010 4699 scope.go:117] "RemoveContainer" containerID="b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"
Feb 26 11:36:37 crc kubenswrapper[4699]: E0226 11:36:37.727332 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6\": container with ID starting with b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6 not found: ID does not exist" containerID="b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.727367 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6"} err="failed to get container status \"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6\": rpc error: code = NotFound desc = could not find container \"b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6\": container with ID starting with b30d4966bced026f239d19e3ba2096688e67d934ba8c4fdf464484f205dad9f6 not found: ID does not exist"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.727381 4699 scope.go:117] "RemoveContainer" containerID="a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3"
Feb 26 11:36:37 crc kubenswrapper[4699]: E0226 11:36:37.727612 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3\": container with ID starting with a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3 not found: ID does not exist" containerID="a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3"
Feb 26 11:36:37 crc kubenswrapper[4699]: I0226 11:36:37.727628 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3"} err="failed to get container status \"a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3\": rpc error: code = NotFound desc = could not find container \"a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3\": container with ID starting with a412e1611c3e8a9530b03f1738de0d8137ca46fac15c0f749599ab1aabddd3c3 not found: ID does not exist"
Feb 26 11:36:38 crc kubenswrapper[4699]: I0226 11:36:38.271251 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095e0632-b9cc-4410-af45-249da70797aa" path="/var/lib/kubelet/pods/095e0632-b9cc-4410-af45-249da70797aa/volumes"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.032064 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.137024 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory\") pod \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") "
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.137509 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbnnk\" (UniqueName: \"kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk\") pod \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") "
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.137539 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam\") pod \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\" (UID: \"fcea0fcf-0c80-4334-9327-f0a57b385cc9\") "
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.142686 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk" (OuterVolumeSpecName: "kube-api-access-cbnnk") pod "fcea0fcf-0c80-4334-9327-f0a57b385cc9" (UID: "fcea0fcf-0c80-4334-9327-f0a57b385cc9"). InnerVolumeSpecName "kube-api-access-cbnnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.163837 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory" (OuterVolumeSpecName: "inventory") pod "fcea0fcf-0c80-4334-9327-f0a57b385cc9" (UID: "fcea0fcf-0c80-4334-9327-f0a57b385cc9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.172376 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fcea0fcf-0c80-4334-9327-f0a57b385cc9" (UID: "fcea0fcf-0c80-4334-9327-f0a57b385cc9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.240130 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.240165 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbnnk\" (UniqueName: \"kubernetes.io/projected/fcea0fcf-0c80-4334-9327-f0a57b385cc9-kube-api-access-cbnnk\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.240178 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fcea0fcf-0c80-4334-9327-f0a57b385cc9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.647152 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z" event={"ID":"fcea0fcf-0c80-4334-9327-f0a57b385cc9","Type":"ContainerDied","Data":"eb9cf19f3dcfcec0226f2b1b4e3eeb146f04cb508c317c36f5c63ab6d203f2d3"}
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.647196 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb9cf19f3dcfcec0226f2b1b4e3eeb146f04cb508c317c36f5c63ab6d203f2d3"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.647260 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zdf2z"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706054 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj"]
Feb 26 11:36:39 crc kubenswrapper[4699]: E0226 11:36:39.706580 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="extract-utilities"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706603 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="extract-utilities"
Feb 26 11:36:39 crc kubenswrapper[4699]: E0226 11:36:39.706617 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcea0fcf-0c80-4334-9327-f0a57b385cc9" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706628 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcea0fcf-0c80-4334-9327-f0a57b385cc9" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:36:39 crc kubenswrapper[4699]: E0226 11:36:39.706647 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="registry-server"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706654 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="registry-server"
Feb 26 11:36:39 crc kubenswrapper[4699]: E0226 11:36:39.706673 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="extract-content"
Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706681 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="extract-content"
Feb 26 
11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706935 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcea0fcf-0c80-4334-9327-f0a57b385cc9" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.706963 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="095e0632-b9cc-4410-af45-249da70797aa" containerName="registry-server" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.707781 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.710930 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.711864 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.711955 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.712169 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.722815 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj"] Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.753920 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4lj\" (UniqueName: \"kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.753983 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.754094 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.754141 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.856377 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.856444 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.856562 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4lj\" (UniqueName: \"kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.856589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.860295 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.866718 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.873074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4lj\" (UniqueName: \"kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:39 crc kubenswrapper[4699]: I0226 11:36:39.873398 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:40 crc kubenswrapper[4699]: I0226 11:36:40.028249 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" Feb 26 11:36:40 crc kubenswrapper[4699]: I0226 11:36:40.528877 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj"] Feb 26 11:36:40 crc kubenswrapper[4699]: I0226 11:36:40.657233 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" event={"ID":"fee4a36b-0896-43c1-9b23-3da3ae870cbe","Type":"ContainerStarted","Data":"80e4ef9ff110025ca1ad9e0b0c1c51b00737c757e4cae0d1f58cc0b932613fd9"} Feb 26 11:36:41 crc kubenswrapper[4699]: I0226 11:36:41.585641 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:36:41 crc kubenswrapper[4699]: I0226 11:36:41.586040 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:36:41 crc kubenswrapper[4699]: I0226 11:36:41.677424 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" event={"ID":"fee4a36b-0896-43c1-9b23-3da3ae870cbe","Type":"ContainerStarted","Data":"9da4d0e1b71f7b3bc90f317d243afef6ee0b2480495e8fa6f0ce050f027878f5"} Feb 26 11:36:41 crc kubenswrapper[4699]: I0226 11:36:41.697929 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" podStartSLOduration=2.2693969210000002 podStartE2EDuration="2.69790372s" 
podCreationTimestamp="2026-02-26 11:36:39 +0000 UTC" firstStartedPulling="2026-02-26 11:36:40.530693931 +0000 UTC m=+1546.341520365" lastFinishedPulling="2026-02-26 11:36:40.95920073 +0000 UTC m=+1546.770027164" observedRunningTime="2026-02-26 11:36:41.694334846 +0000 UTC m=+1547.505161280" watchObservedRunningTime="2026-02-26 11:36:41.69790372 +0000 UTC m=+1547.508730174" Feb 26 11:37:11 crc kubenswrapper[4699]: I0226 11:37:11.584929 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:37:11 crc kubenswrapper[4699]: I0226 11:37:11.585519 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:37:28 crc kubenswrapper[4699]: I0226 11:37:28.297665 4699 scope.go:117] "RemoveContainer" containerID="dad7fa90e67d3f965c26f7c4abb45503a74b01c5861c388e8b2b6571901121e5" Feb 26 11:37:28 crc kubenswrapper[4699]: I0226 11:37:28.334730 4699 scope.go:117] "RemoveContainer" containerID="c6b236ca3c3f327dbd547c137704ae3085c07d33a8a0f68103faaa60a3289bc1" Feb 26 11:37:28 crc kubenswrapper[4699]: I0226 11:37:28.391180 4699 scope.go:117] "RemoveContainer" containerID="5822866374c533954891aab83b4e82e6518ecfafe343985ba49ddc3abdfd00dc" Feb 26 11:37:28 crc kubenswrapper[4699]: I0226 11:37:28.489384 4699 scope.go:117] "RemoveContainer" containerID="3fc8431c0d9189816a6d87bbbf1bde79cfcb29458f69200822c417c75941073b" Feb 26 11:37:41 crc kubenswrapper[4699]: I0226 11:37:41.715521 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:37:41 crc kubenswrapper[4699]: I0226 11:37:41.716167 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:37:41 crc kubenswrapper[4699]: I0226 11:37:41.716221 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:37:41 crc kubenswrapper[4699]: I0226 11:37:41.716825 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:37:41 crc kubenswrapper[4699]: I0226 11:37:41.716882 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" gracePeriod=600 Feb 26 11:37:41 crc kubenswrapper[4699]: E0226 11:37:41.848243 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:37:42 crc kubenswrapper[4699]: I0226 11:37:42.820637 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" exitCode=0 Feb 26 11:37:42 crc kubenswrapper[4699]: I0226 11:37:42.820694 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"} Feb 26 11:37:42 crc kubenswrapper[4699]: I0226 11:37:42.820745 4699 scope.go:117] "RemoveContainer" containerID="e281597aa593fa5c9ddd67a617de4ed4d3363a8c5b9ebcaaf78cd70cd013eef6" Feb 26 11:37:42 crc kubenswrapper[4699]: I0226 11:37:42.821497 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:37:42 crc kubenswrapper[4699]: E0226 11:37:42.821916 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:37:57 crc kubenswrapper[4699]: I0226 11:37:57.262501 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:37:57 crc kubenswrapper[4699]: E0226 11:37:57.263196 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.165395 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535098-km5z4"] Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.167964 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.172696 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.172752 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.172864 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.179973 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535098-km5z4"] Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.306903 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hls\" (UniqueName: \"kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls\") pod \"auto-csr-approver-29535098-km5z4\" (UID: \"54818b28-fa0f-4021-9dc0-57f3186f3e64\") " pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.408499 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-28hls\" (UniqueName: \"kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls\") pod \"auto-csr-approver-29535098-km5z4\" (UID: \"54818b28-fa0f-4021-9dc0-57f3186f3e64\") " pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.427066 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hls\" (UniqueName: \"kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls\") pod \"auto-csr-approver-29535098-km5z4\" (UID: \"54818b28-fa0f-4021-9dc0-57f3186f3e64\") " pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:00 crc kubenswrapper[4699]: I0226 11:38:00.500841 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:01 crc kubenswrapper[4699]: I0226 11:38:01.037083 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535098-km5z4"] Feb 26 11:38:01 crc kubenswrapper[4699]: I0226 11:38:01.147978 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535098-km5z4" event={"ID":"54818b28-fa0f-4021-9dc0-57f3186f3e64","Type":"ContainerStarted","Data":"b6ba6bc53e2c20bd1bc843bf1c53f22c7fb2f19628fb3de77d156539c6b892f1"} Feb 26 11:38:04 crc kubenswrapper[4699]: I0226 11:38:04.212783 4699 generic.go:334] "Generic (PLEG): container finished" podID="54818b28-fa0f-4021-9dc0-57f3186f3e64" containerID="1b1986eede2e3874e8730ee539f7fe36f87c4471b7b1fdf2129756beebd0a599" exitCode=0 Feb 26 11:38:04 crc kubenswrapper[4699]: I0226 11:38:04.212928 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535098-km5z4" event={"ID":"54818b28-fa0f-4021-9dc0-57f3186f3e64","Type":"ContainerDied","Data":"1b1986eede2e3874e8730ee539f7fe36f87c4471b7b1fdf2129756beebd0a599"} Feb 26 11:38:05 crc kubenswrapper[4699]: I0226 
11:38:05.755054 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:05 crc kubenswrapper[4699]: I0226 11:38:05.909571 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hls\" (UniqueName: \"kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls\") pod \"54818b28-fa0f-4021-9dc0-57f3186f3e64\" (UID: \"54818b28-fa0f-4021-9dc0-57f3186f3e64\") " Feb 26 11:38:05 crc kubenswrapper[4699]: I0226 11:38:05.915149 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls" (OuterVolumeSpecName: "kube-api-access-28hls") pod "54818b28-fa0f-4021-9dc0-57f3186f3e64" (UID: "54818b28-fa0f-4021-9dc0-57f3186f3e64"). InnerVolumeSpecName "kube-api-access-28hls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.079636 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hls\" (UniqueName: \"kubernetes.io/projected/54818b28-fa0f-4021-9dc0-57f3186f3e64-kube-api-access-28hls\") on node \"crc\" DevicePath \"\"" Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.232709 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535098-km5z4" event={"ID":"54818b28-fa0f-4021-9dc0-57f3186f3e64","Type":"ContainerDied","Data":"b6ba6bc53e2c20bd1bc843bf1c53f22c7fb2f19628fb3de77d156539c6b892f1"} Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.232748 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6ba6bc53e2c20bd1bc843bf1c53f22c7fb2f19628fb3de77d156539c6b892f1" Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.232815 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535098-km5z4" Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.830763 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535092-t7q4h"] Feb 26 11:38:06 crc kubenswrapper[4699]: I0226 11:38:06.839511 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535092-t7q4h"] Feb 26 11:38:08 crc kubenswrapper[4699]: I0226 11:38:08.401590 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343bb829-035d-4834-a0c4-d9a61c11a2ee" path="/var/lib/kubelet/pods/343bb829-035d-4834-a0c4-d9a61c11a2ee/volumes" Feb 26 11:38:10 crc kubenswrapper[4699]: I0226 11:38:10.261491 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:38:10 crc kubenswrapper[4699]: E0226 11:38:10.261794 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:38:25 crc kubenswrapper[4699]: I0226 11:38:25.261233 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:38:25 crc kubenswrapper[4699]: E0226 11:38:25.262047 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:38:28 crc kubenswrapper[4699]: I0226 11:38:28.712621 4699 scope.go:117] "RemoveContainer" containerID="f6df4021899217dba2f01191869995ca628d93d69016b474ac26db11ce7351f9" Feb 26 11:38:28 crc kubenswrapper[4699]: I0226 11:38:28.736980 4699 scope.go:117] "RemoveContainer" containerID="3c60b289616323cd6352bf0b5554d4a5d5ee327ffbb6b71e27e82bb85958f651" Feb 26 11:38:28 crc kubenswrapper[4699]: I0226 11:38:28.758754 4699 scope.go:117] "RemoveContainer" containerID="f2cdecc6eba8599d08f98abb877e3708c955cb03d406931c6fd1ea5f2ab28e98" Feb 26 11:38:39 crc kubenswrapper[4699]: I0226 11:38:39.260729 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:38:39 crc kubenswrapper[4699]: E0226 11:38:39.261580 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:38:52 crc kubenswrapper[4699]: I0226 11:38:52.260978 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:38:52 crc kubenswrapper[4699]: E0226 11:38:52.261897 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.302000 4699 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:38:55 crc kubenswrapper[4699]: E0226 11:38:55.302752 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54818b28-fa0f-4021-9dc0-57f3186f3e64" containerName="oc"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.302776 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="54818b28-fa0f-4021-9dc0-57f3186f3e64" containerName="oc"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.302991 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="54818b28-fa0f-4021-9dc0-57f3186f3e64" containerName="oc"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.304698 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.319720 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.400139 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.400296 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn72s\" (UniqueName: \"kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.400368 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.501679 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.501813 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.501941 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn72s\" (UniqueName: \"kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.502419 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.502773 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.527228 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn72s\" (UniqueName: \"kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s\") pod \"community-operators-wngln\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") " pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:55 crc kubenswrapper[4699]: I0226 11:38:55.628100 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:38:56 crc kubenswrapper[4699]: I0226 11:38:56.141841 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:38:56 crc kubenswrapper[4699]: I0226 11:38:56.639057 4699 generic.go:334] "Generic (PLEG): container finished" podID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerID="1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81" exitCode=0
Feb 26 11:38:56 crc kubenswrapper[4699]: I0226 11:38:56.639394 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerDied","Data":"1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81"}
Feb 26 11:38:56 crc kubenswrapper[4699]: I0226 11:38:56.639475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerStarted","Data":"ea93986519d68d2ee2fa0d6490c8ca1431cde9b600c5babfa220cd098cbce583"}
Feb 26 11:38:58 crc kubenswrapper[4699]: I0226 11:38:58.659834 4699 generic.go:334] "Generic (PLEG): container finished" podID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerID="55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff" exitCode=0
Feb 26 11:38:58 crc kubenswrapper[4699]: I0226 11:38:58.659939 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerDied","Data":"55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff"}
Feb 26 11:39:01 crc kubenswrapper[4699]: I0226 11:39:01.694946 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerStarted","Data":"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"}
Feb 26 11:39:01 crc kubenswrapper[4699]: I0226 11:39:01.720740 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wngln" podStartSLOduration=2.540242452 podStartE2EDuration="6.720682741s" podCreationTimestamp="2026-02-26 11:38:55 +0000 UTC" firstStartedPulling="2026-02-26 11:38:56.641870717 +0000 UTC m=+1682.452697151" lastFinishedPulling="2026-02-26 11:39:00.822311006 +0000 UTC m=+1686.633137440" observedRunningTime="2026-02-26 11:39:01.714576287 +0000 UTC m=+1687.525402751" watchObservedRunningTime="2026-02-26 11:39:01.720682741 +0000 UTC m=+1687.531509175"
Feb 26 11:39:05 crc kubenswrapper[4699]: I0226 11:39:05.628907 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:05 crc kubenswrapper[4699]: I0226 11:39:05.629585 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:05 crc kubenswrapper[4699]: I0226 11:39:05.680315 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:07 crc kubenswrapper[4699]: I0226 11:39:07.261438 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:39:07 crc kubenswrapper[4699]: E0226 11:39:07.261823 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:39:15 crc kubenswrapper[4699]: I0226 11:39:15.682579 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:15 crc kubenswrapper[4699]: I0226 11:39:15.749159 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.037579 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wngln" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="registry-server" containerID="cri-o://cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef" gracePeriod=2
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.501024 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.671646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn72s\" (UniqueName: \"kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s\") pod \"00de79e0-b495-44ac-ac69-461dae5cfcea\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") "
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.671759 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities\") pod \"00de79e0-b495-44ac-ac69-461dae5cfcea\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") "
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.671874 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content\") pod \"00de79e0-b495-44ac-ac69-461dae5cfcea\" (UID: \"00de79e0-b495-44ac-ac69-461dae5cfcea\") "
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.673184 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities" (OuterVolumeSpecName: "utilities") pod "00de79e0-b495-44ac-ac69-461dae5cfcea" (UID: "00de79e0-b495-44ac-ac69-461dae5cfcea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.678888 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s" (OuterVolumeSpecName: "kube-api-access-sn72s") pod "00de79e0-b495-44ac-ac69-461dae5cfcea" (UID: "00de79e0-b495-44ac-ac69-461dae5cfcea"). InnerVolumeSpecName "kube-api-access-sn72s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.739556 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00de79e0-b495-44ac-ac69-461dae5cfcea" (UID: "00de79e0-b495-44ac-ac69-461dae5cfcea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.774709 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.774761 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00de79e0-b495-44ac-ac69-461dae5cfcea-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:16 crc kubenswrapper[4699]: I0226 11:39:16.774783 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn72s\" (UniqueName: \"kubernetes.io/projected/00de79e0-b495-44ac-ac69-461dae5cfcea-kube-api-access-sn72s\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.049179 4699 generic.go:334] "Generic (PLEG): container finished" podID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerID="cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef" exitCode=0
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.049238 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerDied","Data":"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"}
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.049274 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wngln" event={"ID":"00de79e0-b495-44ac-ac69-461dae5cfcea","Type":"ContainerDied","Data":"ea93986519d68d2ee2fa0d6490c8ca1431cde9b600c5babfa220cd098cbce583"}
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.049323 4699 scope.go:117] "RemoveContainer" containerID="cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.049326 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wngln"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.075418 4699 scope.go:117] "RemoveContainer" containerID="55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.102176 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.114838 4699 scope.go:117] "RemoveContainer" containerID="1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.138365 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wngln"]
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.153179 4699 scope.go:117] "RemoveContainer" containerID="cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"
Feb 26 11:39:17 crc kubenswrapper[4699]: E0226 11:39:17.153932 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef\": container with ID starting with cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef not found: ID does not exist" containerID="cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.153964 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef"} err="failed to get container status \"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef\": rpc error: code = NotFound desc = could not find container \"cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef\": container with ID starting with cda5dec024a6a906207b193d321e8bcde22d54b3fabfbb5a1562fbd3f13a94ef not found: ID does not exist"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.153985 4699 scope.go:117] "RemoveContainer" containerID="55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff"
Feb 26 11:39:17 crc kubenswrapper[4699]: E0226 11:39:17.154559 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff\": container with ID starting with 55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff not found: ID does not exist" containerID="55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.154581 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff"} err="failed to get container status \"55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff\": rpc error: code = NotFound desc = could not find container \"55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff\": container with ID starting with 55e33d9e4a2f48076bebaa29f774aa42b701e2bb71b502fe33614a88e47091ff not found: ID does not exist"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.154627 4699 scope.go:117] "RemoveContainer" containerID="1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81"
Feb 26 11:39:17 crc kubenswrapper[4699]: E0226 11:39:17.155086 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81\": container with ID starting with 1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81 not found: ID does not exist" containerID="1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81"
Feb 26 11:39:17 crc kubenswrapper[4699]: I0226 11:39:17.155130 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81"} err="failed to get container status \"1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81\": rpc error: code = NotFound desc = could not find container \"1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81\": container with ID starting with 1140853f450f2ac7dc6b93a4d99c33ede9a8bdc36ddcd35c196e7946c551da81 not found: ID does not exist"
Feb 26 11:39:18 crc kubenswrapper[4699]: I0226 11:39:18.271842 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" path="/var/lib/kubelet/pods/00de79e0-b495-44ac-ac69-461dae5cfcea/volumes"
Feb 26 11:39:19 crc kubenswrapper[4699]: I0226 11:39:19.261185 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:39:19 crc kubenswrapper[4699]: E0226 11:39:19.261536 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:39:30 crc kubenswrapper[4699]: I0226 11:39:30.260935 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:39:30 crc kubenswrapper[4699]: E0226 11:39:30.261782 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:39:45 crc kubenswrapper[4699]: I0226 11:39:45.261791 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:39:45 crc kubenswrapper[4699]: E0226 11:39:45.262649 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:39:55 crc kubenswrapper[4699]: I0226 11:39:55.163965 4699 generic.go:334] "Generic (PLEG): container finished" podID="fee4a36b-0896-43c1-9b23-3da3ae870cbe" containerID="9da4d0e1b71f7b3bc90f317d243afef6ee0b2480495e8fa6f0ce050f027878f5" exitCode=0
Feb 26 11:39:55 crc kubenswrapper[4699]: I0226 11:39:55.164045 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" event={"ID":"fee4a36b-0896-43c1-9b23-3da3ae870cbe","Type":"ContainerDied","Data":"9da4d0e1b71f7b3bc90f317d243afef6ee0b2480495e8fa6f0ce050f027878f5"}
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.269487 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:39:56 crc kubenswrapper[4699]: E0226 11:39:56.269725 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.595782 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj"
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.768456 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle\") pod \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") "
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.768509 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4lj\" (UniqueName: \"kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj\") pod \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") "
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.768642 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory\") pod \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") "
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.768673 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam\") pod \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\" (UID: \"fee4a36b-0896-43c1-9b23-3da3ae870cbe\") "
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.774577 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fee4a36b-0896-43c1-9b23-3da3ae870cbe" (UID: "fee4a36b-0896-43c1-9b23-3da3ae870cbe"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.774822 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj" (OuterVolumeSpecName: "kube-api-access-rw4lj") pod "fee4a36b-0896-43c1-9b23-3da3ae870cbe" (UID: "fee4a36b-0896-43c1-9b23-3da3ae870cbe"). InnerVolumeSpecName "kube-api-access-rw4lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.800225 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory" (OuterVolumeSpecName: "inventory") pod "fee4a36b-0896-43c1-9b23-3da3ae870cbe" (UID: "fee4a36b-0896-43c1-9b23-3da3ae870cbe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.806224 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fee4a36b-0896-43c1-9b23-3da3ae870cbe" (UID: "fee4a36b-0896-43c1-9b23-3da3ae870cbe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.870574 4699 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.870611 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4lj\" (UniqueName: \"kubernetes.io/projected/fee4a36b-0896-43c1-9b23-3da3ae870cbe-kube-api-access-rw4lj\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.870621 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:56 crc kubenswrapper[4699]: I0226 11:39:56.870629 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fee4a36b-0896-43c1-9b23-3da3ae870cbe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.185445 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj" event={"ID":"fee4a36b-0896-43c1-9b23-3da3ae870cbe","Type":"ContainerDied","Data":"80e4ef9ff110025ca1ad9e0b0c1c51b00737c757e4cae0d1f58cc0b932613fd9"}
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.185488 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.185489 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e4ef9ff110025ca1ad9e0b0c1c51b00737c757e4cae0d1f58cc0b932613fd9"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.277781 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"]
Feb 26 11:39:57 crc kubenswrapper[4699]: E0226 11:39:57.278523 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="registry-server"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278545 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="registry-server"
Feb 26 11:39:57 crc kubenswrapper[4699]: E0226 11:39:57.278576 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee4a36b-0896-43c1-9b23-3da3ae870cbe" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278585 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee4a36b-0896-43c1-9b23-3da3ae870cbe" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:39:57 crc kubenswrapper[4699]: E0226 11:39:57.278608 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="extract-utilities"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278616 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="extract-utilities"
Feb 26 11:39:57 crc kubenswrapper[4699]: E0226 11:39:57.278628 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="extract-content"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278635 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="extract-content"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278892 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="00de79e0-b495-44ac-ac69-461dae5cfcea" containerName="registry-server"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.278922 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee4a36b-0896-43c1-9b23-3da3ae870cbe" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.279717 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.281958 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.282392 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.284061 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.284101 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.295716 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"]
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.379553 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.379964 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.380077 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zj6f\" (UniqueName: \"kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.482718 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.482839 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.482940 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zj6f\" (UniqueName: \"kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.488801 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.488801 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.505755 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zj6f\" (UniqueName: \"kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f97wz\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:57 crc kubenswrapper[4699]: I0226 11:39:57.601308 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:39:58 crc kubenswrapper[4699]: I0226 11:39:58.098196 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"]
Feb 26 11:39:58 crc kubenswrapper[4699]: I0226 11:39:58.194398 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" event={"ID":"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2","Type":"ContainerStarted","Data":"f077cdeff29985bc87c067f11fd69e3bb120e90af57ac246059fdb95f6bcb184"}
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.133109 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535100-2fxw5"]
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.135306 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535100-2fxw5"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.139861 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.140308 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.140797 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.150240 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxz8\" (UniqueName: \"kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8\") pod \"auto-csr-approver-29535100-2fxw5\" (UID: \"db34348f-7e21-4666-8e45-c48a1fdbe2a4\") " pod="openshift-infra/auto-csr-approver-29535100-2fxw5"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.154556 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535100-2fxw5"]
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.212075 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" event={"ID":"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2","Type":"ContainerStarted","Data":"6b432756b4c02ac4dd161ed536fa1431f018acfe6fea2e615d58626a9b11073c"}
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.250948 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" podStartSLOduration=2.015014334 podStartE2EDuration="3.250928939s" podCreationTimestamp="2026-02-26 11:39:57 +0000 UTC" firstStartedPulling="2026-02-26 11:39:58.102985337 +0000 UTC m=+1743.913811771" lastFinishedPulling="2026-02-26 11:39:59.338899942 +0000 UTC m=+1745.149726376" observedRunningTime="2026-02-26 11:40:00.247571743 +0000 UTC m=+1746.058398187" watchObservedRunningTime="2026-02-26 11:40:00.250928939 +0000 UTC m=+1746.061755373"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.252599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxz8\" (UniqueName: \"kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8\") pod \"auto-csr-approver-29535100-2fxw5\" (UID: \"db34348f-7e21-4666-8e45-c48a1fdbe2a4\") " pod="openshift-infra/auto-csr-approver-29535100-2fxw5"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.275681 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxz8\" (UniqueName: \"kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8\") pod \"auto-csr-approver-29535100-2fxw5\" (UID: \"db34348f-7e21-4666-8e45-c48a1fdbe2a4\") " pod="openshift-infra/auto-csr-approver-29535100-2fxw5"
Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.463645 4699 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" Feb 26 11:40:00 crc kubenswrapper[4699]: I0226 11:40:00.947247 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535100-2fxw5"] Feb 26 11:40:00 crc kubenswrapper[4699]: W0226 11:40:00.951424 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb34348f_7e21_4666_8e45_c48a1fdbe2a4.slice/crio-46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696 WatchSource:0}: Error finding container 46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696: Status 404 returned error can't find the container with id 46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696 Feb 26 11:40:01 crc kubenswrapper[4699]: I0226 11:40:01.227370 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" event={"ID":"db34348f-7e21-4666-8e45-c48a1fdbe2a4","Type":"ContainerStarted","Data":"46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696"} Feb 26 11:40:03 crc kubenswrapper[4699]: I0226 11:40:03.245575 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" event={"ID":"db34348f-7e21-4666-8e45-c48a1fdbe2a4","Type":"ContainerStarted","Data":"4505b88d80198e91d210a89e948ba5fb9b137a6a7006ae878e49e6ab4a45d98a"} Feb 26 11:40:03 crc kubenswrapper[4699]: I0226 11:40:03.262028 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" podStartSLOduration=1.457652541 podStartE2EDuration="3.262006001s" podCreationTimestamp="2026-02-26 11:40:00 +0000 UTC" firstStartedPulling="2026-02-26 11:40:00.954604655 +0000 UTC m=+1746.765431099" lastFinishedPulling="2026-02-26 11:40:02.758958135 +0000 UTC m=+1748.569784559" observedRunningTime="2026-02-26 11:40:03.260159118 +0000 UTC m=+1749.070985572" 
watchObservedRunningTime="2026-02-26 11:40:03.262006001 +0000 UTC m=+1749.072832445" Feb 26 11:40:04 crc kubenswrapper[4699]: I0226 11:40:04.262690 4699 generic.go:334] "Generic (PLEG): container finished" podID="db34348f-7e21-4666-8e45-c48a1fdbe2a4" containerID="4505b88d80198e91d210a89e948ba5fb9b137a6a7006ae878e49e6ab4a45d98a" exitCode=0 Feb 26 11:40:04 crc kubenswrapper[4699]: I0226 11:40:04.278836 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" event={"ID":"db34348f-7e21-4666-8e45-c48a1fdbe2a4","Type":"ContainerDied","Data":"4505b88d80198e91d210a89e948ba5fb9b137a6a7006ae878e49e6ab4a45d98a"} Feb 26 11:40:05 crc kubenswrapper[4699]: I0226 11:40:05.559362 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" Feb 26 11:40:05 crc kubenswrapper[4699]: I0226 11:40:05.754762 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjxz8\" (UniqueName: \"kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8\") pod \"db34348f-7e21-4666-8e45-c48a1fdbe2a4\" (UID: \"db34348f-7e21-4666-8e45-c48a1fdbe2a4\") " Feb 26 11:40:05 crc kubenswrapper[4699]: I0226 11:40:05.761485 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8" (OuterVolumeSpecName: "kube-api-access-pjxz8") pod "db34348f-7e21-4666-8e45-c48a1fdbe2a4" (UID: "db34348f-7e21-4666-8e45-c48a1fdbe2a4"). InnerVolumeSpecName "kube-api-access-pjxz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:40:05 crc kubenswrapper[4699]: I0226 11:40:05.857080 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjxz8\" (UniqueName: \"kubernetes.io/projected/db34348f-7e21-4666-8e45-c48a1fdbe2a4-kube-api-access-pjxz8\") on node \"crc\" DevicePath \"\"" Feb 26 11:40:06 crc kubenswrapper[4699]: I0226 11:40:06.281365 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" event={"ID":"db34348f-7e21-4666-8e45-c48a1fdbe2a4","Type":"ContainerDied","Data":"46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696"} Feb 26 11:40:06 crc kubenswrapper[4699]: I0226 11:40:06.281417 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46becb21ca1b56f63c2a251fd67139c1dd9217c8a3b123d6942ff01d48839696" Feb 26 11:40:06 crc kubenswrapper[4699]: I0226 11:40:06.281438 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535100-2fxw5" Feb 26 11:40:06 crc kubenswrapper[4699]: I0226 11:40:06.340835 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535094-ccf5t"] Feb 26 11:40:06 crc kubenswrapper[4699]: I0226 11:40:06.350501 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535094-ccf5t"] Feb 26 11:40:07 crc kubenswrapper[4699]: I0226 11:40:07.263164 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:40:07 crc kubenswrapper[4699]: E0226 11:40:07.263600 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:40:08 crc kubenswrapper[4699]: I0226 11:40:08.271194 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63cbb99-64c1-46fe-99eb-0d06cc310cba" path="/var/lib/kubelet/pods/a63cbb99-64c1-46fe-99eb-0d06cc310cba/volumes" Feb 26 11:40:19 crc kubenswrapper[4699]: I0226 11:40:19.262278 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:40:19 crc kubenswrapper[4699]: E0226 11:40:19.263157 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:40:28 crc kubenswrapper[4699]: I0226 11:40:28.922811 4699 scope.go:117] "RemoveContainer" containerID="2fbcb8eac2ddc22c3ecc04313ce75c8a329d85e31714a8bfe7dae5bd6310f0ad" Feb 26 11:40:28 crc kubenswrapper[4699]: I0226 11:40:28.988556 4699 scope.go:117] "RemoveContainer" containerID="50c7ddb03e58cd9791ab6f41d1755213bce0ea0826aec0f5b6934548dfaf9782" Feb 26 11:40:29 crc kubenswrapper[4699]: I0226 11:40:29.023826 4699 scope.go:117] "RemoveContainer" containerID="9bee82430e4d84a9497e3680da14bb7fec649ba1905937229370f30514994319" Feb 26 11:40:34 crc kubenswrapper[4699]: I0226 11:40:34.260839 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:40:34 crc kubenswrapper[4699]: E0226 11:40:34.261683 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.037424 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-29gg4"] Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.047805 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8e68-account-create-update-bwkx8"] Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.058244 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-29gg4"] Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.069221 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8e68-account-create-update-bwkx8"] Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.273493 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9392947-cd31-4afd-92c7-73bac0d4cbd3" path="/var/lib/kubelet/pods/e9392947-cd31-4afd-92c7-73bac0d4cbd3/volumes" Feb 26 11:40:40 crc kubenswrapper[4699]: I0226 11:40:40.274328 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e3aace-8f02-410d-8e7e-4fa61336435b" path="/var/lib/kubelet/pods/f6e3aace-8f02-410d-8e7e-4fa61336435b/volumes" Feb 26 11:40:47 crc kubenswrapper[4699]: I0226 11:40:47.260513 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:40:47 crc kubenswrapper[4699]: E0226 11:40:47.261271 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:40:49 crc kubenswrapper[4699]: I0226 11:40:49.039982 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nhpn8"] Feb 26 11:40:49 crc kubenswrapper[4699]: I0226 11:40:49.051319 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f9e8-account-create-update-zqq4d"] Feb 26 11:40:49 crc kubenswrapper[4699]: I0226 11:40:49.062496 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f9e8-account-create-update-zqq4d"] Feb 26 11:40:49 crc kubenswrapper[4699]: I0226 11:40:49.073519 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nhpn8"] Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.030351 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0fa1-account-create-update-l7dhx"] Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.040379 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0fa1-account-create-update-l7dhx"] Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.048197 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-htqpz"] Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.055702 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-htqpz"] Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.271001 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e74821a-c4e5-4812-829d-c6b60b6657b8" path="/var/lib/kubelet/pods/0e74821a-c4e5-4812-829d-c6b60b6657b8/volumes" Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.271670 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d08e57-ba28-4614-8b11-2bd1bd4f836f" path="/var/lib/kubelet/pods/22d08e57-ba28-4614-8b11-2bd1bd4f836f/volumes" Feb 26 11:40:50 crc kubenswrapper[4699]: 
I0226 11:40:50.272243 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b0134d-d882-4622-86a4-ab8172ee4fb2" path="/var/lib/kubelet/pods/64b0134d-d882-4622-86a4-ab8172ee4fb2/volumes" Feb 26 11:40:50 crc kubenswrapper[4699]: I0226 11:40:50.272806 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c209748-0c47-4bbb-883b-f4c245b6a156" path="/var/lib/kubelet/pods/9c209748-0c47-4bbb-883b-f4c245b6a156/volumes" Feb 26 11:40:59 crc kubenswrapper[4699]: I0226 11:40:59.261203 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:40:59 crc kubenswrapper[4699]: E0226 11:40:59.262015 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:41:05 crc kubenswrapper[4699]: I0226 11:41:05.042174 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gjgfc"] Feb 26 11:41:05 crc kubenswrapper[4699]: I0226 11:41:05.054180 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gjgfc"] Feb 26 11:41:06 crc kubenswrapper[4699]: I0226 11:41:06.276438 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c102f5c-cbaf-429e-b487-8b179f989720" path="/var/lib/kubelet/pods/7c102f5c-cbaf-429e-b487-8b179f989720/volumes" Feb 26 11:41:10 crc kubenswrapper[4699]: I0226 11:41:10.266188 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:41:10 crc kubenswrapper[4699]: E0226 11:41:10.269923 4699 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.033244 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5a9e-account-create-update-fzhw8"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.049652 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4fx8g"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.065283 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5a9e-account-create-update-fzhw8"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.075871 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4fx8g"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.087635 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bl9wp"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.096280 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a7a2-account-create-update-l2mt4"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.105031 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f3b2-account-create-update-xhgnq"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.112883 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f3b2-account-create-update-xhgnq"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.127365 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a7a2-account-create-update-l2mt4"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.141433 4699 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bl9wp"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.152671 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-v77r5"] Feb 26 11:41:11 crc kubenswrapper[4699]: I0226 11:41:11.160997 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v77r5"] Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.271321 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1029eddb-2336-4ec5-af4a-b8fed82d3d55" path="/var/lib/kubelet/pods/1029eddb-2336-4ec5-af4a-b8fed82d3d55/volumes" Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.272924 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c910eba-ce23-4fd9-b08a-54b96fe6a2da" path="/var/lib/kubelet/pods/4c910eba-ce23-4fd9-b08a-54b96fe6a2da/volumes" Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.274147 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9e36d9-5d53-46d8-a91a-22dc9338ab58" path="/var/lib/kubelet/pods/5c9e36d9-5d53-46d8-a91a-22dc9338ab58/volumes" Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.275228 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758bbe1c-d826-47f7-aff6-54e9fc4ebe63" path="/var/lib/kubelet/pods/758bbe1c-d826-47f7-aff6-54e9fc4ebe63/volumes" Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.276930 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a68fa18-1c49-4d3d-bc5f-75763944d818" path="/var/lib/kubelet/pods/7a68fa18-1c49-4d3d-bc5f-75763944d818/volumes" Feb 26 11:41:12 crc kubenswrapper[4699]: I0226 11:41:12.277900 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c25243e-b6d9-40f5-9c3b-31947cf74cc9" path="/var/lib/kubelet/pods/8c25243e-b6d9-40f5-9c3b-31947cf74cc9/volumes" Feb 26 11:41:15 crc kubenswrapper[4699]: I0226 11:41:15.033896 4699 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v9z8k"] Feb 26 11:41:15 crc kubenswrapper[4699]: I0226 11:41:15.047136 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nblvp"] Feb 26 11:41:15 crc kubenswrapper[4699]: I0226 11:41:15.059391 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v9z8k"] Feb 26 11:41:15 crc kubenswrapper[4699]: I0226 11:41:15.067006 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nblvp"] Feb 26 11:41:16 crc kubenswrapper[4699]: I0226 11:41:16.272829 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c1d656-4f85-483b-b7a2-6132b71ae093" path="/var/lib/kubelet/pods/72c1d656-4f85-483b-b7a2-6132b71ae093/volumes" Feb 26 11:41:16 crc kubenswrapper[4699]: I0226 11:41:16.274632 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f040612-306e-4ce2-b289-ed5be7bbc9e3" path="/var/lib/kubelet/pods/7f040612-306e-4ce2-b289-ed5be7bbc9e3/volumes" Feb 26 11:41:22 crc kubenswrapper[4699]: I0226 11:41:22.262110 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:41:22 crc kubenswrapper[4699]: E0226 11:41:22.263067 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.159282 4699 scope.go:117] "RemoveContainer" containerID="d84c1ad7d451293243927fb877d730897ca18c570d340c3870da5a49cf7b4e49" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.188024 4699 scope.go:117] "RemoveContainer" 
containerID="f9bc95d14d4ca0f4150bed4b727cc55b90093e4c3307ebc23256f5bd6248badb" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.234566 4699 scope.go:117] "RemoveContainer" containerID="5b4e9b46d7abb3978f9445cbfeebb825f9cd664cf115705fdae6f65a2a171de8" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.310993 4699 scope.go:117] "RemoveContainer" containerID="99b2baa30a79cd9b1afa4299366118e58d2c6c18512f6454267d08d3b636f3e6" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.390571 4699 scope.go:117] "RemoveContainer" containerID="8ac6484a77ece8a11d14d59104b361e660535022ac1b3f3359289cdf598c1ea3" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.413325 4699 scope.go:117] "RemoveContainer" containerID="9e6e239d14eb5fdc0f0fee3107f485263c4c1938d985d9c817ca4f3885c7de71" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.455536 4699 scope.go:117] "RemoveContainer" containerID="e9c4f64540efb8ca94268435547206be7e8a21ea869414c0e0fe3fdc2ad23ae0" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.473750 4699 scope.go:117] "RemoveContainer" containerID="0d9733430c4e718e7aff62771d81bae98ffdfc65e518351b1e877ae065bfd725" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.496544 4699 scope.go:117] "RemoveContainer" containerID="7c9888c6347c41b14207598f1324ae87027fe21cf208ac04db043c3350762dde" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.523764 4699 scope.go:117] "RemoveContainer" containerID="02517dfaa484539c60d2ef72e32d7a113f0b9a11e109ec31ac01691b7f015d05" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.544498 4699 scope.go:117] "RemoveContainer" containerID="91516e9d3caed541543b28d1d1f9c624822ee3d8a280a0f3e6e9514175f1fe30" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.569015 4699 scope.go:117] "RemoveContainer" containerID="c5f501a1150c4caded935575b10f8f9230324616853238eace0db08d01347483" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.590247 4699 scope.go:117] "RemoveContainer" 
containerID="6bf24901f54aea8222e7ac0b7dea606ea0a09d83f0dad7544b8e7bc98249b1e8" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.619016 4699 scope.go:117] "RemoveContainer" containerID="6a7d35b314cb71b7aea626b804eac24b58050ec797d6079e6362282e3f1a7a28" Feb 26 11:41:29 crc kubenswrapper[4699]: I0226 11:41:29.645867 4699 scope.go:117] "RemoveContainer" containerID="f56c01ae851446ecb80715a4bf6a848caa81425dc5709a8852bd80e336fdb67f" Feb 26 11:41:33 crc kubenswrapper[4699]: I0226 11:41:33.087135 4699 generic.go:334] "Generic (PLEG): container finished" podID="8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" containerID="6b432756b4c02ac4dd161ed536fa1431f018acfe6fea2e615d58626a9b11073c" exitCode=0 Feb 26 11:41:33 crc kubenswrapper[4699]: I0226 11:41:33.087178 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" event={"ID":"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2","Type":"ContainerDied","Data":"6b432756b4c02ac4dd161ed536fa1431f018acfe6fea2e615d58626a9b11073c"} Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.261028 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:41:34 crc kubenswrapper[4699]: E0226 11:41:34.261919 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.527572 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.701137 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory\") pod \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.701268 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam\") pod \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.701438 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zj6f\" (UniqueName: \"kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f\") pod \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\" (UID: \"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2\") " Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.707975 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f" (OuterVolumeSpecName: "kube-api-access-4zj6f") pod "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" (UID: "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2"). InnerVolumeSpecName "kube-api-access-4zj6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.733769 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory" (OuterVolumeSpecName: "inventory") pod "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" (UID: "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.738000 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" (UID: "8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.803825 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zj6f\" (UniqueName: \"kubernetes.io/projected/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-kube-api-access-4zj6f\") on node \"crc\" DevicePath \"\""
Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.804202 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:41:34 crc kubenswrapper[4699]: I0226 11:41:34.804213 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.105804 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz" event={"ID":"8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2","Type":"ContainerDied","Data":"f077cdeff29985bc87c067f11fd69e3bb120e90af57ac246059fdb95f6bcb184"}
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.106043 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f077cdeff29985bc87c067f11fd69e3bb120e90af57ac246059fdb95f6bcb184"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.105877 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f97wz"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.194979 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"]
Feb 26 11:41:35 crc kubenswrapper[4699]: E0226 11:41:35.195689 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.195715 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:41:35 crc kubenswrapper[4699]: E0226 11:41:35.195773 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db34348f-7e21-4666-8e45-c48a1fdbe2a4" containerName="oc"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.195783 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="db34348f-7e21-4666-8e45-c48a1fdbe2a4" containerName="oc"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.196037 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="db34348f-7e21-4666-8e45-c48a1fdbe2a4" containerName="oc"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.196071 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.196779 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.199177 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.199394 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.199575 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.199760 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.211412 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"]
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.313378 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wqj\" (UniqueName: \"kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.313480 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.313712 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.415464 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wqj\" (UniqueName: \"kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.416150 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.416278 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.420491 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.427833 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.435595 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wqj\" (UniqueName: \"kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-86gl7\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:35 crc kubenswrapper[4699]: I0226 11:41:35.512840 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:41:36 crc kubenswrapper[4699]: I0226 11:41:36.064752 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"]
Feb 26 11:41:36 crc kubenswrapper[4699]: I0226 11:41:36.071619 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 11:41:36 crc kubenswrapper[4699]: I0226 11:41:36.114791 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7" event={"ID":"b1a06be0-15ce-4abd-b9e7-7e11e789bd64","Type":"ContainerStarted","Data":"db75ae825dbb40d97a2b9db69df2b648d27c8bcd6afdccffa8c07497a1f62677"}
Feb 26 11:41:37 crc kubenswrapper[4699]: I0226 11:41:37.127755 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7" event={"ID":"b1a06be0-15ce-4abd-b9e7-7e11e789bd64","Type":"ContainerStarted","Data":"06dd2f994e026e3d5c71102e70c0d33cced4374bc16162d705261194153c852c"}
Feb 26 11:41:37 crc kubenswrapper[4699]: I0226 11:41:37.151318 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7" podStartSLOduration=1.7427489729999999 podStartE2EDuration="2.151292805s" podCreationTimestamp="2026-02-26 11:41:35 +0000 UTC" firstStartedPulling="2026-02-26 11:41:36.07127231 +0000 UTC m=+1841.882098754" lastFinishedPulling="2026-02-26 11:41:36.479816152 +0000 UTC m=+1842.290642586" observedRunningTime="2026-02-26 11:41:37.141816701 +0000 UTC m=+1842.952643135" watchObservedRunningTime="2026-02-26 11:41:37.151292805 +0000 UTC m=+1842.962119239"
Feb 26 11:41:45 crc kubenswrapper[4699]: I0226 11:41:45.261243 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:41:45 crc kubenswrapper[4699]: E0226 11:41:45.262038 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:41:46 crc kubenswrapper[4699]: I0226 11:41:46.615494 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dr78q"]
Feb 26 11:41:46 crc kubenswrapper[4699]: I0226 11:41:46.631965 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dr78q"]
Feb 26 11:41:48 crc kubenswrapper[4699]: I0226 11:41:48.270467 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae813248-510e-4b19-bcd8-39cefca6cd37" path="/var/lib/kubelet/pods/ae813248-510e-4b19-bcd8-39cefca6cd37/volumes"
Feb 26 11:41:53 crc kubenswrapper[4699]: I0226 11:41:53.027435 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7g59c"]
Feb 26 11:41:53 crc kubenswrapper[4699]: I0226 11:41:53.036400 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7g59c"]
Feb 26 11:41:54 crc kubenswrapper[4699]: I0226 11:41:54.274100 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45d20cb-c561-4b84-b327-9b096865e8bb" path="/var/lib/kubelet/pods/d45d20cb-c561-4b84-b327-9b096865e8bb/volumes"
Feb 26 11:41:58 crc kubenswrapper[4699]: I0226 11:41:58.261910 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:41:58 crc kubenswrapper[4699]: E0226 11:41:58.263466 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.045857 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-z6w9z"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.058844 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-z6w9z"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.070356 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-28v5g"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.078311 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-28v5g"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.142407 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535102-2zbvr"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.144323 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.147060 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.147637 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.147785 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.165975 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535102-2zbvr"]
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.552345 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a9d008-5b7e-4866-b92b-efcb60cbfdb0" path="/var/lib/kubelet/pods/47a9d008-5b7e-4866-b92b-efcb60cbfdb0/volumes"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.553206 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33c7b6e-a78a-4a10-848c-a65d01deee0b" path="/var/lib/kubelet/pods/b33c7b6e-a78a-4a10-848c-a65d01deee0b/volumes"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.640586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d22sr\" (UniqueName: \"kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr\") pod \"auto-csr-approver-29535102-2zbvr\" (UID: \"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984\") " pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.743022 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d22sr\" (UniqueName: \"kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr\") pod \"auto-csr-approver-29535102-2zbvr\" (UID: \"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984\") " pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.761828 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d22sr\" (UniqueName: \"kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr\") pod \"auto-csr-approver-29535102-2zbvr\" (UID: \"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984\") " pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:00 crc kubenswrapper[4699]: I0226 11:42:00.858871 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:01 crc kubenswrapper[4699]: I0226 11:42:01.329694 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535102-2zbvr"]
Feb 26 11:42:01 crc kubenswrapper[4699]: I0226 11:42:01.705863 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535102-2zbvr" event={"ID":"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984","Type":"ContainerStarted","Data":"a18abcf8f2dfd7199fbcf5f7f1c9ab4491141187d87e951eae13077028a31efd"}
Feb 26 11:42:03 crc kubenswrapper[4699]: I0226 11:42:03.726240 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535102-2zbvr" event={"ID":"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984","Type":"ContainerStarted","Data":"dfc62ad99cdddeccaa0a04e48b0be130dad6cc30569fc90d45e5fa7beabda285"}
Feb 26 11:42:03 crc kubenswrapper[4699]: I0226 11:42:03.740417 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535102-2zbvr" podStartSLOduration=1.7117044529999998 podStartE2EDuration="3.74039234s" podCreationTimestamp="2026-02-26 11:42:00 +0000 UTC" firstStartedPulling="2026-02-26 11:42:01.332841617 +0000 UTC m=+1867.143668051" lastFinishedPulling="2026-02-26 11:42:03.361529494 +0000 UTC m=+1869.172355938" observedRunningTime="2026-02-26 11:42:03.739182907 +0000 UTC m=+1869.550009361" watchObservedRunningTime="2026-02-26 11:42:03.74039234 +0000 UTC m=+1869.551218784"
Feb 26 11:42:04 crc kubenswrapper[4699]: I0226 11:42:04.741092 4699 generic.go:334] "Generic (PLEG): container finished" podID="1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" containerID="dfc62ad99cdddeccaa0a04e48b0be130dad6cc30569fc90d45e5fa7beabda285" exitCode=0
Feb 26 11:42:04 crc kubenswrapper[4699]: I0226 11:42:04.741173 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535102-2zbvr" event={"ID":"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984","Type":"ContainerDied","Data":"dfc62ad99cdddeccaa0a04e48b0be130dad6cc30569fc90d45e5fa7beabda285"}
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.106269 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.376007 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d22sr\" (UniqueName: \"kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr\") pod \"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984\" (UID: \"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984\") "
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.386143 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr" (OuterVolumeSpecName: "kube-api-access-d22sr") pod "1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" (UID: "1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984"). InnerVolumeSpecName "kube-api-access-d22sr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.482156 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d22sr\" (UniqueName: \"kubernetes.io/projected/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984-kube-api-access-d22sr\") on node \"crc\" DevicePath \"\""
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.827521 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535102-2zbvr" event={"ID":"1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984","Type":"ContainerDied","Data":"a18abcf8f2dfd7199fbcf5f7f1c9ab4491141187d87e951eae13077028a31efd"}
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.827581 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18abcf8f2dfd7199fbcf5f7f1c9ab4491141187d87e951eae13077028a31efd"
Feb 26 11:42:06 crc kubenswrapper[4699]: I0226 11:42:06.827655 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535102-2zbvr"
Feb 26 11:42:07 crc kubenswrapper[4699]: I0226 11:42:07.009871 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535096-xr7rk"]
Feb 26 11:42:07 crc kubenswrapper[4699]: I0226 11:42:07.022829 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535096-xr7rk"]
Feb 26 11:42:08 crc kubenswrapper[4699]: I0226 11:42:08.276437 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b65e61c-3853-4fd6-93c2-9d13c6776589" path="/var/lib/kubelet/pods/6b65e61c-3853-4fd6-93c2-9d13c6776589/volumes"
Feb 26 11:42:11 crc kubenswrapper[4699]: I0226 11:42:11.260692 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:42:11 crc kubenswrapper[4699]: E0226 11:42:11.261513 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:42:23 crc kubenswrapper[4699]: I0226 11:42:23.261570 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:42:23 crc kubenswrapper[4699]: E0226 11:42:23.262519 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:42:24 crc kubenswrapper[4699]: I0226 11:42:24.031327 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f49xd"]
Feb 26 11:42:24 crc kubenswrapper[4699]: I0226 11:42:24.038749 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f49xd"]
Feb 26 11:42:24 crc kubenswrapper[4699]: I0226 11:42:24.270614 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8426fd89-9eba-46fa-8611-e98cc7636b41" path="/var/lib/kubelet/pods/8426fd89-9eba-46fa-8611-e98cc7636b41/volumes"
Feb 26 11:42:29 crc kubenswrapper[4699]: I0226 11:42:29.969685 4699 scope.go:117] "RemoveContainer" containerID="861736c6decfb2ac1c3010699205e1df4da771409780863184ec8e9136dd76db"
Feb 26 11:42:30 crc kubenswrapper[4699]: I0226 11:42:30.013544 4699 scope.go:117] "RemoveContainer" containerID="4266f5dcbf67cb6303072faf9cd69cd6aabcaee0bb9544fa39ab82b24cc3c4e5"
Feb 26 11:42:30 crc kubenswrapper[4699]: I0226 11:42:30.059273 4699 scope.go:117] "RemoveContainer" containerID="45bdc052e6dc259f4ccec396b223ed5d541f623efae769fc3c166913b1ca187a"
Feb 26 11:42:30 crc kubenswrapper[4699]: I0226 11:42:30.153043 4699 scope.go:117] "RemoveContainer" containerID="2cec29afd9941e14f3e1571b5331427d3b1faa6723571c88143afc902d980bd2"
Feb 26 11:42:30 crc kubenswrapper[4699]: I0226 11:42:30.215656 4699 scope.go:117] "RemoveContainer" containerID="dd9ce01dbb3d28e8559eda1261c169a7dbac7ba191f3aabd0c7a5d33511f3c12"
Feb 26 11:42:30 crc kubenswrapper[4699]: I0226 11:42:30.273688 4699 scope.go:117] "RemoveContainer" containerID="0eab0de6a835999edb566f7a018ef04e992296918bfb17f761cbea8ef8c3775a"
Feb 26 11:42:38 crc kubenswrapper[4699]: I0226 11:42:38.260944 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:42:38 crc kubenswrapper[4699]: E0226 11:42:38.261699 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 11:42:52 crc kubenswrapper[4699]: I0226 11:42:52.260493 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99"
Feb 26 11:42:53 crc kubenswrapper[4699]: I0226 11:42:53.018453 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345"}
Feb 26 11:42:55 crc kubenswrapper[4699]: I0226 11:42:55.037012 4699 generic.go:334] "Generic (PLEG): container finished" podID="b1a06be0-15ce-4abd-b9e7-7e11e789bd64" containerID="06dd2f994e026e3d5c71102e70c0d33cced4374bc16162d705261194153c852c" exitCode=0
Feb 26 11:42:55 crc kubenswrapper[4699]: I0226 11:42:55.037175 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7" event={"ID":"b1a06be0-15ce-4abd-b9e7-7e11e789bd64","Type":"ContainerDied","Data":"06dd2f994e026e3d5c71102e70c0d33cced4374bc16162d705261194153c852c"}
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.459094 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.590705 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wqj\" (UniqueName: \"kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj\") pod \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") "
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.590862 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam\") pod \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") "
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.591079 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory\") pod \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\" (UID: \"b1a06be0-15ce-4abd-b9e7-7e11e789bd64\") "
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.596761 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj" (OuterVolumeSpecName: "kube-api-access-z4wqj") pod "b1a06be0-15ce-4abd-b9e7-7e11e789bd64" (UID: "b1a06be0-15ce-4abd-b9e7-7e11e789bd64"). InnerVolumeSpecName "kube-api-access-z4wqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.619138 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1a06be0-15ce-4abd-b9e7-7e11e789bd64" (UID: "b1a06be0-15ce-4abd-b9e7-7e11e789bd64"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.632661 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory" (OuterVolumeSpecName: "inventory") pod "b1a06be0-15ce-4abd-b9e7-7e11e789bd64" (UID: "b1a06be0-15ce-4abd-b9e7-7e11e789bd64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.693618 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.693655 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:42:56 crc kubenswrapper[4699]: I0226 11:42:56.693666 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4wqj\" (UniqueName: \"kubernetes.io/projected/b1a06be0-15ce-4abd-b9e7-7e11e789bd64-kube-api-access-z4wqj\") on node \"crc\" DevicePath \"\""
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.056009 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7" event={"ID":"b1a06be0-15ce-4abd-b9e7-7e11e789bd64","Type":"ContainerDied","Data":"db75ae825dbb40d97a2b9db69df2b648d27c8bcd6afdccffa8c07497a1f62677"}
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.056339 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db75ae825dbb40d97a2b9db69df2b648d27c8bcd6afdccffa8c07497a1f62677"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.056189 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-86gl7"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.149561 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"]
Feb 26 11:42:57 crc kubenswrapper[4699]: E0226 11:42:57.150022 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a06be0-15ce-4abd-b9e7-7e11e789bd64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.150042 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a06be0-15ce-4abd-b9e7-7e11e789bd64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:42:57 crc kubenswrapper[4699]: E0226 11:42:57.150067 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" containerName="oc"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.150074 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" containerName="oc"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.150262 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" containerName="oc"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.150303 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a06be0-15ce-4abd-b9e7-7e11e789bd64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.153265 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.155737 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.156494 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.156666 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.157752 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.171268 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"]
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.305415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8ws\" (UniqueName: \"kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.305764 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.306197 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.408383 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.408481 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8ws\" (UniqueName: \"kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.408526 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.420010 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.420239 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.433133 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8ws\" (UniqueName: \"kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9npsm\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.473312 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"
Feb 26 11:42:57 crc kubenswrapper[4699]: I0226 11:42:57.964622 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm"]
Feb 26 11:42:58 crc kubenswrapper[4699]: I0226 11:42:58.068165 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" event={"ID":"974c869a-b430-4a83-81d0-ece37d67c0b0","Type":"ContainerStarted","Data":"069297dd71fe712a0a36e6e82a7ee33d0dad62eba7903614e0f1c84d725d3c0f"}
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.048990 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-snmfx"]
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.059475 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-62mhs"]
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.068864 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-snmfx"]
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.081135 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3146-account-create-update-xf6c8"]
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.089768 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-62mhs"]
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.092083 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" event={"ID":"974c869a-b430-4a83-81d0-ece37d67c0b0","Type":"ContainerStarted","Data":"047a7bcc737231590d42107b96e6ff16ff3d82797549985bb5d0845e611f758d"}
Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.098933 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3146-account-create-update-xf6c8"]
Feb 26 11:43:00
crc kubenswrapper[4699]: I0226 11:43:00.270816 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5acc31-dbe4-4698-8346-9a0dbc05234b" path="/var/lib/kubelet/pods/6c5acc31-dbe4-4698-8346-9a0dbc05234b/volumes" Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.271447 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e54b257-33a7-43bd-80c5-30915ae82341" path="/var/lib/kubelet/pods/7e54b257-33a7-43bd-80c5-30915ae82341/volumes" Feb 26 11:43:00 crc kubenswrapper[4699]: I0226 11:43:00.272062 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a229eb-75a5-41b1-8342-53a3a1b433a0" path="/var/lib/kubelet/pods/f4a229eb-75a5-41b1-8342-53a3a1b433a0/volumes" Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.019536 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" podStartSLOduration=2.430598463 podStartE2EDuration="4.01949415s" podCreationTimestamp="2026-02-26 11:42:57 +0000 UTC" firstStartedPulling="2026-02-26 11:42:57.965938692 +0000 UTC m=+1923.776765126" lastFinishedPulling="2026-02-26 11:42:59.554834379 +0000 UTC m=+1925.365660813" observedRunningTime="2026-02-26 11:43:00.114687318 +0000 UTC m=+1925.925513762" watchObservedRunningTime="2026-02-26 11:43:01.01949415 +0000 UTC m=+1926.830320584" Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.032219 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-66cf-account-create-update-qvvdk"] Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.044565 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-43f0-account-create-update-vgmlz"] Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.055504 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-66cf-account-create-update-qvvdk"] Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.065396 4699 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hq69l"] Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.074240 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-43f0-account-create-update-vgmlz"] Feb 26 11:43:01 crc kubenswrapper[4699]: I0226 11:43:01.081929 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hq69l"] Feb 26 11:43:02 crc kubenswrapper[4699]: I0226 11:43:02.271042 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86425865-434f-43e8-9592-e890078837a2" path="/var/lib/kubelet/pods/86425865-434f-43e8-9592-e890078837a2/volumes" Feb 26 11:43:02 crc kubenswrapper[4699]: I0226 11:43:02.271810 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea40818-89fa-4b78-9833-82635861fee1" path="/var/lib/kubelet/pods/dea40818-89fa-4b78-9833-82635861fee1/volumes" Feb 26 11:43:02 crc kubenswrapper[4699]: I0226 11:43:02.272394 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99c6b36-a5f6-4f0b-973f-dfa853d2c558" path="/var/lib/kubelet/pods/f99c6b36-a5f6-4f0b-973f-dfa853d2c558/volumes" Feb 26 11:43:05 crc kubenswrapper[4699]: I0226 11:43:05.133909 4699 generic.go:334] "Generic (PLEG): container finished" podID="974c869a-b430-4a83-81d0-ece37d67c0b0" containerID="047a7bcc737231590d42107b96e6ff16ff3d82797549985bb5d0845e611f758d" exitCode=0 Feb 26 11:43:05 crc kubenswrapper[4699]: I0226 11:43:05.133966 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" event={"ID":"974c869a-b430-4a83-81d0-ece37d67c0b0","Type":"ContainerDied","Data":"047a7bcc737231590d42107b96e6ff16ff3d82797549985bb5d0845e611f758d"} Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.537020 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.689762 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory\") pod \"974c869a-b430-4a83-81d0-ece37d67c0b0\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.689851 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8ws\" (UniqueName: \"kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws\") pod \"974c869a-b430-4a83-81d0-ece37d67c0b0\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.689991 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam\") pod \"974c869a-b430-4a83-81d0-ece37d67c0b0\" (UID: \"974c869a-b430-4a83-81d0-ece37d67c0b0\") " Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.696775 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws" (OuterVolumeSpecName: "kube-api-access-gs8ws") pod "974c869a-b430-4a83-81d0-ece37d67c0b0" (UID: "974c869a-b430-4a83-81d0-ece37d67c0b0"). InnerVolumeSpecName "kube-api-access-gs8ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.716027 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory" (OuterVolumeSpecName: "inventory") pod "974c869a-b430-4a83-81d0-ece37d67c0b0" (UID: "974c869a-b430-4a83-81d0-ece37d67c0b0"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.722023 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "974c869a-b430-4a83-81d0-ece37d67c0b0" (UID: "974c869a-b430-4a83-81d0-ece37d67c0b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.793302 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.793448 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8ws\" (UniqueName: \"kubernetes.io/projected/974c869a-b430-4a83-81d0-ece37d67c0b0-kube-api-access-gs8ws\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:06 crc kubenswrapper[4699]: I0226 11:43:06.793536 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/974c869a-b430-4a83-81d0-ece37d67c0b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.153013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" event={"ID":"974c869a-b430-4a83-81d0-ece37d67c0b0","Type":"ContainerDied","Data":"069297dd71fe712a0a36e6e82a7ee33d0dad62eba7903614e0f1c84d725d3c0f"} Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.153049 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9npsm" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.153059 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="069297dd71fe712a0a36e6e82a7ee33d0dad62eba7903614e0f1c84d725d3c0f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.258982 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f"] Feb 26 11:43:07 crc kubenswrapper[4699]: E0226 11:43:07.259732 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974c869a-b430-4a83-81d0-ece37d67c0b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.259758 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="974c869a-b430-4a83-81d0-ece37d67c0b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.259952 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="974c869a-b430-4a83-81d0-ece37d67c0b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.261038 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.266186 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.267453 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.267453 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.267665 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.273865 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f"] Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.405093 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4kk\" (UniqueName: \"kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.405161 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.405384 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.508049 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4kk\" (UniqueName: \"kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.508172 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.508237 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.513018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.516723 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.525514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4kk\" (UniqueName: \"kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mlb2f\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:07 crc kubenswrapper[4699]: I0226 11:43:07.577175 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:08 crc kubenswrapper[4699]: I0226 11:43:08.140523 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f"] Feb 26 11:43:08 crc kubenswrapper[4699]: W0226 11:43:08.152445 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac66647f_74c0_4a4e_9925_e47cd90568a1.slice/crio-589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c WatchSource:0}: Error finding container 589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c: Status 404 returned error can't find the container with id 589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c Feb 26 11:43:08 crc kubenswrapper[4699]: I0226 11:43:08.163603 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" event={"ID":"ac66647f-74c0-4a4e-9925-e47cd90568a1","Type":"ContainerStarted","Data":"589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c"} Feb 26 11:43:09 crc kubenswrapper[4699]: I0226 11:43:09.176463 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" event={"ID":"ac66647f-74c0-4a4e-9925-e47cd90568a1","Type":"ContainerStarted","Data":"37c024ee15929d11af3667b4c33bbdf3d64440abcac66b262307ff7f2f9f1b7f"} Feb 26 11:43:09 crc kubenswrapper[4699]: I0226 11:43:09.201744 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" podStartSLOduration=1.72808494 podStartE2EDuration="2.20171633s" podCreationTimestamp="2026-02-26 11:43:07 +0000 UTC" firstStartedPulling="2026-02-26 11:43:08.156381411 +0000 UTC m=+1933.967207845" lastFinishedPulling="2026-02-26 11:43:08.630012801 +0000 UTC m=+1934.440839235" 
observedRunningTime="2026-02-26 11:43:09.196800823 +0000 UTC m=+1935.007627277" watchObservedRunningTime="2026-02-26 11:43:09.20171633 +0000 UTC m=+1935.012542774" Feb 26 11:43:28 crc kubenswrapper[4699]: I0226 11:43:28.039699 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vx5jv"] Feb 26 11:43:28 crc kubenswrapper[4699]: I0226 11:43:28.047696 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vx5jv"] Feb 26 11:43:28 crc kubenswrapper[4699]: I0226 11:43:28.275244 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef20f352-fa9c-4bc8-875d-d537f00f75d5" path="/var/lib/kubelet/pods/ef20f352-fa9c-4bc8-875d-d537f00f75d5/volumes" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.444461 4699 scope.go:117] "RemoveContainer" containerID="b4034fed15cab382c6c5fd47ff21f822b9c9aa9789392181d8ca9fe59c0d233d" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.511520 4699 scope.go:117] "RemoveContainer" containerID="e2f8c469ec04f6028bf261997ea76ce892a579e71cd0b1e3cbda4d1a898468a0" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.537385 4699 scope.go:117] "RemoveContainer" containerID="b2b62d6d79c5c992c3884d7e4c7aa453502b8500701d02db975cc913cb332656" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.592986 4699 scope.go:117] "RemoveContainer" containerID="9eff27ca91f87caa5ed2a02975a6d6bc2e239264a6a323e5cbc0471084500265" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.687900 4699 scope.go:117] "RemoveContainer" containerID="853cdd9a99dcd559f8a9a9863c9ecd3351cc72fb23481557abd22c41a3816b2d" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.721886 4699 scope.go:117] "RemoveContainer" containerID="ea224b941b0465af7d8b7b7d5e0297ed56d62f796e3b6566730ce00cb01d16ec" Feb 26 11:43:30 crc kubenswrapper[4699]: I0226 11:43:30.772672 4699 scope.go:117] "RemoveContainer" 
containerID="0569f07824e60d0703bc892d604ca5230523b1fde72c768bd283ae0d47703780" Feb 26 11:43:44 crc kubenswrapper[4699]: I0226 11:43:44.518778 4699 generic.go:334] "Generic (PLEG): container finished" podID="ac66647f-74c0-4a4e-9925-e47cd90568a1" containerID="37c024ee15929d11af3667b4c33bbdf3d64440abcac66b262307ff7f2f9f1b7f" exitCode=0 Feb 26 11:43:44 crc kubenswrapper[4699]: I0226 11:43:44.518890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" event={"ID":"ac66647f-74c0-4a4e-9925-e47cd90568a1","Type":"ContainerDied","Data":"37c024ee15929d11af3667b4c33bbdf3d64440abcac66b262307ff7f2f9f1b7f"} Feb 26 11:43:45 crc kubenswrapper[4699]: I0226 11:43:45.938713 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.039019 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mcdml"] Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.046342 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mcdml"] Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.090976 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh4kk\" (UniqueName: \"kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk\") pod \"ac66647f-74c0-4a4e-9925-e47cd90568a1\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.091024 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory\") pod \"ac66647f-74c0-4a4e-9925-e47cd90568a1\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.091229 4699 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam\") pod \"ac66647f-74c0-4a4e-9925-e47cd90568a1\" (UID: \"ac66647f-74c0-4a4e-9925-e47cd90568a1\") " Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.097611 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk" (OuterVolumeSpecName: "kube-api-access-rh4kk") pod "ac66647f-74c0-4a4e-9925-e47cd90568a1" (UID: "ac66647f-74c0-4a4e-9925-e47cd90568a1"). InnerVolumeSpecName "kube-api-access-rh4kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.120052 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac66647f-74c0-4a4e-9925-e47cd90568a1" (UID: "ac66647f-74c0-4a4e-9925-e47cd90568a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.131436 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory" (OuterVolumeSpecName: "inventory") pod "ac66647f-74c0-4a4e-9925-e47cd90568a1" (UID: "ac66647f-74c0-4a4e-9925-e47cd90568a1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.192879 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.192910 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh4kk\" (UniqueName: \"kubernetes.io/projected/ac66647f-74c0-4a4e-9925-e47cd90568a1-kube-api-access-rh4kk\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.192919 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac66647f-74c0-4a4e-9925-e47cd90568a1-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.270558 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f528c9c1-4318-4d46-9b02-43f955e04009" path="/var/lib/kubelet/pods/f528c9c1-4318-4d46-9b02-43f955e04009/volumes" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.549249 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" event={"ID":"ac66647f-74c0-4a4e-9925-e47cd90568a1","Type":"ContainerDied","Data":"589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c"} Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.549641 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589d89715ee8910d0add53e05a162f3c15e96c44d15bb00039839ce8af8bf08c" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.549617 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mlb2f" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.639838 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"] Feb 26 11:43:46 crc kubenswrapper[4699]: E0226 11:43:46.640809 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac66647f-74c0-4a4e-9925-e47cd90568a1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.640894 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac66647f-74c0-4a4e-9925-e47cd90568a1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.641196 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac66647f-74c0-4a4e-9925-e47cd90568a1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.641891 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.648199 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.648499 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.648220 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.661664 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.667616 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"] Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.806458 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.806749 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.807264 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2ll\" (UniqueName: \"kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.909921 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2ll\" (UniqueName: \"kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.909988 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.910099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.915080 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.915232 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.938788 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2ll\" (UniqueName: \"kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h9q25\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:46 crc kubenswrapper[4699]: I0226 11:43:46.970732 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:43:47 crc kubenswrapper[4699]: I0226 11:43:47.491315 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"]
Feb 26 11:43:47 crc kubenswrapper[4699]: I0226 11:43:47.558304 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" event={"ID":"85e0d37e-fb25-4bbc-afe5-7e6ab304390c","Type":"ContainerStarted","Data":"0a2e58c697cadea58aebf86626b65bde4f82fab06f41769728b26cd2783dc764"}
Feb 26 11:43:48 crc kubenswrapper[4699]: I0226 11:43:48.567195 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" event={"ID":"85e0d37e-fb25-4bbc-afe5-7e6ab304390c","Type":"ContainerStarted","Data":"ba3628d7d80420e0729959a1fcf9d498ca569fe68feae991747716a7c0d13fa3"}
Feb 26 11:43:48 crc kubenswrapper[4699]: I0226 11:43:48.588238 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" podStartSLOduration=2.164664608 podStartE2EDuration="2.588066053s" podCreationTimestamp="2026-02-26 11:43:46 +0000 UTC" firstStartedPulling="2026-02-26 11:43:47.499482235 +0000 UTC m=+1973.310308669" lastFinishedPulling="2026-02-26 11:43:47.92288368 +0000 UTC m=+1973.733710114" observedRunningTime="2026-02-26 11:43:48.57921296 +0000 UTC m=+1974.390039414" watchObservedRunningTime="2026-02-26 11:43:48.588066053 +0000 UTC m=+1974.398892487"
Feb 26 11:43:51 crc kubenswrapper[4699]: I0226 11:43:51.037686 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz84d"]
Feb 26 11:43:51 crc kubenswrapper[4699]: I0226 11:43:51.046682 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz84d"]
Feb 26 11:43:52 crc kubenswrapper[4699]: I0226 11:43:52.272518 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a" path="/var/lib/kubelet/pods/b5fb37dd-bd18-4ada-97c4-3ff3e3555d8a/volumes"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.138107 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535104-r58dw"]
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.140865 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.142608 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.144183 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.145089 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.149070 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535104-r58dw"]
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.297497 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqj9x\" (UniqueName: \"kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x\") pod \"auto-csr-approver-29535104-r58dw\" (UID: \"3a59d7ac-e643-4693-9c6b-994f1fadd83d\") " pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.399291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqj9x\" (UniqueName: \"kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x\") pod \"auto-csr-approver-29535104-r58dw\" (UID: \"3a59d7ac-e643-4693-9c6b-994f1fadd83d\") " pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.420101 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqj9x\" (UniqueName: \"kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x\") pod \"auto-csr-approver-29535104-r58dw\" (UID: \"3a59d7ac-e643-4693-9c6b-994f1fadd83d\") " pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.470909 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:00 crc kubenswrapper[4699]: I0226 11:44:00.944392 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535104-r58dw"]
Feb 26 11:44:01 crc kubenswrapper[4699]: I0226 11:44:01.719081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535104-r58dw" event={"ID":"3a59d7ac-e643-4693-9c6b-994f1fadd83d","Type":"ContainerStarted","Data":"81727e4e4d4367b44e1f05a5ee53466b5819100a16f232ecabf7d87d7d5e9e95"}
Feb 26 11:44:03 crc kubenswrapper[4699]: I0226 11:44:03.742864 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a59d7ac-e643-4693-9c6b-994f1fadd83d" containerID="6dd92189791b2617628aa3e717314eb02f69fda3f8d5e7e8ceb2bcddb537435f" exitCode=0
Feb 26 11:44:03 crc kubenswrapper[4699]: I0226 11:44:03.742979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535104-r58dw" event={"ID":"3a59d7ac-e643-4693-9c6b-994f1fadd83d","Type":"ContainerDied","Data":"6dd92189791b2617628aa3e717314eb02f69fda3f8d5e7e8ceb2bcddb537435f"}
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.101172 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.202100 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqj9x\" (UniqueName: \"kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x\") pod \"3a59d7ac-e643-4693-9c6b-994f1fadd83d\" (UID: \"3a59d7ac-e643-4693-9c6b-994f1fadd83d\") "
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.208568 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x" (OuterVolumeSpecName: "kube-api-access-nqj9x") pod "3a59d7ac-e643-4693-9c6b-994f1fadd83d" (UID: "3a59d7ac-e643-4693-9c6b-994f1fadd83d"). InnerVolumeSpecName "kube-api-access-nqj9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.305088 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqj9x\" (UniqueName: \"kubernetes.io/projected/3a59d7ac-e643-4693-9c6b-994f1fadd83d-kube-api-access-nqj9x\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.761567 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535104-r58dw" event={"ID":"3a59d7ac-e643-4693-9c6b-994f1fadd83d","Type":"ContainerDied","Data":"81727e4e4d4367b44e1f05a5ee53466b5819100a16f232ecabf7d87d7d5e9e95"}
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.761932 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81727e4e4d4367b44e1f05a5ee53466b5819100a16f232ecabf7d87d7d5e9e95"
Feb 26 11:44:05 crc kubenswrapper[4699]: I0226 11:44:05.761614 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535104-r58dw"
Feb 26 11:44:06 crc kubenswrapper[4699]: I0226 11:44:06.178411 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535098-km5z4"]
Feb 26 11:44:06 crc kubenswrapper[4699]: I0226 11:44:06.189865 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535098-km5z4"]
Feb 26 11:44:06 crc kubenswrapper[4699]: I0226 11:44:06.273496 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54818b28-fa0f-4021-9dc0-57f3186f3e64" path="/var/lib/kubelet/pods/54818b28-fa0f-4021-9dc0-57f3186f3e64/volumes"
Feb 26 11:44:30 crc kubenswrapper[4699]: I0226 11:44:30.929224 4699 scope.go:117] "RemoveContainer" containerID="6a0914a3db1c0b6e1b3a5a9cf2e1d8ac0e44a6dc0eb35fc159954e4b3f365a3d"
Feb 26 11:44:30 crc kubenswrapper[4699]: I0226 11:44:30.999716 4699 scope.go:117] "RemoveContainer" containerID="2cee4e67f7ca1be08a16734a80281eca2dc16bb5d20a6d285f430706b65292fe"
Feb 26 11:44:31 crc kubenswrapper[4699]: I0226 11:44:31.047877 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-77cbz"]
Feb 26 11:44:31 crc kubenswrapper[4699]: I0226 11:44:31.054412 4699 scope.go:117] "RemoveContainer" containerID="1b1986eede2e3874e8730ee539f7fe36f87c4471b7b1fdf2129756beebd0a599"
Feb 26 11:44:31 crc kubenswrapper[4699]: I0226 11:44:31.058149 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-77cbz"]
Feb 26 11:44:32 crc kubenswrapper[4699]: I0226 11:44:32.273367 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462a2449-2712-4bb7-9ec9-6e09a1800361" path="/var/lib/kubelet/pods/462a2449-2712-4bb7-9ec9-6e09a1800361/volumes"
Feb 26 11:44:35 crc kubenswrapper[4699]: I0226 11:44:35.037571 4699 generic.go:334] "Generic (PLEG): container finished" podID="85e0d37e-fb25-4bbc-afe5-7e6ab304390c" containerID="ba3628d7d80420e0729959a1fcf9d498ca569fe68feae991747716a7c0d13fa3" exitCode=0
Feb 26 11:44:35 crc kubenswrapper[4699]: I0226 11:44:35.037678 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" event={"ID":"85e0d37e-fb25-4bbc-afe5-7e6ab304390c","Type":"ContainerDied","Data":"ba3628d7d80420e0729959a1fcf9d498ca569fe68feae991747716a7c0d13fa3"}
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.519817 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.650171 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam\") pod \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") "
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.650225 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv2ll\" (UniqueName: \"kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll\") pod \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") "
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.650273 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory\") pod \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\" (UID: \"85e0d37e-fb25-4bbc-afe5-7e6ab304390c\") "
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.657620 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll" (OuterVolumeSpecName: "kube-api-access-qv2ll") pod "85e0d37e-fb25-4bbc-afe5-7e6ab304390c" (UID: "85e0d37e-fb25-4bbc-afe5-7e6ab304390c"). InnerVolumeSpecName "kube-api-access-qv2ll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.680044 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "85e0d37e-fb25-4bbc-afe5-7e6ab304390c" (UID: "85e0d37e-fb25-4bbc-afe5-7e6ab304390c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.680097 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory" (OuterVolumeSpecName: "inventory") pod "85e0d37e-fb25-4bbc-afe5-7e6ab304390c" (UID: "85e0d37e-fb25-4bbc-afe5-7e6ab304390c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.752901 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.753198 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv2ll\" (UniqueName: \"kubernetes.io/projected/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-kube-api-access-qv2ll\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:36 crc kubenswrapper[4699]: I0226 11:44:36.753325 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e0d37e-fb25-4bbc-afe5-7e6ab304390c-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.058069 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25" event={"ID":"85e0d37e-fb25-4bbc-afe5-7e6ab304390c","Type":"ContainerDied","Data":"0a2e58c697cadea58aebf86626b65bde4f82fab06f41769728b26cd2783dc764"}
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.058423 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a2e58c697cadea58aebf86626b65bde4f82fab06f41769728b26cd2783dc764"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.058183 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h9q25"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.141854 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4sjg"]
Feb 26 11:44:37 crc kubenswrapper[4699]: E0226 11:44:37.142338 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a59d7ac-e643-4693-9c6b-994f1fadd83d" containerName="oc"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.142366 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a59d7ac-e643-4693-9c6b-994f1fadd83d" containerName="oc"
Feb 26 11:44:37 crc kubenswrapper[4699]: E0226 11:44:37.142411 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e0d37e-fb25-4bbc-afe5-7e6ab304390c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.142428 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e0d37e-fb25-4bbc-afe5-7e6ab304390c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.142678 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a59d7ac-e643-4693-9c6b-994f1fadd83d" containerName="oc"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.142706 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e0d37e-fb25-4bbc-afe5-7e6ab304390c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.143471 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.145450 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.145852 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.145852 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.148203 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.153904 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4sjg"]
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.262580 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2sgr\" (UniqueName: \"kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.262696 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.262766 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.366058 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sgr\" (UniqueName: \"kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.366250 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.366326 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.370697 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.370710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.385240 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2sgr\" (UniqueName: \"kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr\") pod \"ssh-known-hosts-edpm-deployment-t4sjg\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:37 crc kubenswrapper[4699]: I0226 11:44:37.467553 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:38 crc kubenswrapper[4699]: I0226 11:44:38.002851 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4sjg"]
Feb 26 11:44:38 crc kubenswrapper[4699]: I0226 11:44:38.067465 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg" event={"ID":"2930a730-d5e2-49e1-a618-7428b999a73d","Type":"ContainerStarted","Data":"2063f30625bff358b16eb9d11ebeaaff802901d1ca01220a33d3df5d5689163f"}
Feb 26 11:44:39 crc kubenswrapper[4699]: I0226 11:44:39.108302 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg" event={"ID":"2930a730-d5e2-49e1-a618-7428b999a73d","Type":"ContainerStarted","Data":"fb17c90a52d984a5a986e30c43494b9457c4431e764fdc4b4d2b63320bf412e8"}
Feb 26 11:44:39 crc kubenswrapper[4699]: I0226 11:44:39.130651 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg" podStartSLOduration=1.534482164 podStartE2EDuration="2.130629655s" podCreationTimestamp="2026-02-26 11:44:37 +0000 UTC" firstStartedPulling="2026-02-26 11:44:38.000601692 +0000 UTC m=+2023.811428146" lastFinishedPulling="2026-02-26 11:44:38.596749203 +0000 UTC m=+2024.407575637" observedRunningTime="2026-02-26 11:44:39.126193148 +0000 UTC m=+2024.937019572" watchObservedRunningTime="2026-02-26 11:44:39.130629655 +0000 UTC m=+2024.941456089"
Feb 26 11:44:46 crc kubenswrapper[4699]: I0226 11:44:46.185026 4699 generic.go:334] "Generic (PLEG): container finished" podID="2930a730-d5e2-49e1-a618-7428b999a73d" containerID="fb17c90a52d984a5a986e30c43494b9457c4431e764fdc4b4d2b63320bf412e8" exitCode=0
Feb 26 11:44:46 crc kubenswrapper[4699]: I0226 11:44:46.185084 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg" event={"ID":"2930a730-d5e2-49e1-a618-7428b999a73d","Type":"ContainerDied","Data":"fb17c90a52d984a5a986e30c43494b9457c4431e764fdc4b4d2b63320bf412e8"}
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.603016 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.678675 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam\") pod \"2930a730-d5e2-49e1-a618-7428b999a73d\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") "
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.678833 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2sgr\" (UniqueName: \"kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr\") pod \"2930a730-d5e2-49e1-a618-7428b999a73d\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") "
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.678916 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0\") pod \"2930a730-d5e2-49e1-a618-7428b999a73d\" (UID: \"2930a730-d5e2-49e1-a618-7428b999a73d\") "
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.684659 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr" (OuterVolumeSpecName: "kube-api-access-b2sgr") pod "2930a730-d5e2-49e1-a618-7428b999a73d" (UID: "2930a730-d5e2-49e1-a618-7428b999a73d"). InnerVolumeSpecName "kube-api-access-b2sgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.708554 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2930a730-d5e2-49e1-a618-7428b999a73d" (UID: "2930a730-d5e2-49e1-a618-7428b999a73d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.713385 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2930a730-d5e2-49e1-a618-7428b999a73d" (UID: "2930a730-d5e2-49e1-a618-7428b999a73d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.782843 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.782946 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2sgr\" (UniqueName: \"kubernetes.io/projected/2930a730-d5e2-49e1-a618-7428b999a73d-kube-api-access-b2sgr\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:47 crc kubenswrapper[4699]: I0226 11:44:47.782963 4699 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2930a730-d5e2-49e1-a618-7428b999a73d-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.202007 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg" event={"ID":"2930a730-d5e2-49e1-a618-7428b999a73d","Type":"ContainerDied","Data":"2063f30625bff358b16eb9d11ebeaaff802901d1ca01220a33d3df5d5689163f"}
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.202454 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2063f30625bff358b16eb9d11ebeaaff802901d1ca01220a33d3df5d5689163f"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.202090 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4sjg"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.307040 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"]
Feb 26 11:44:48 crc kubenswrapper[4699]: E0226 11:44:48.307702 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2930a730-d5e2-49e1-a618-7428b999a73d" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.307723 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2930a730-d5e2-49e1-a618-7428b999a73d" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.307962 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2930a730-d5e2-49e1-a618-7428b999a73d" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.308793 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.310869 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.310978 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.311002 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.312449 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.329454 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"]
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.396810 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.397168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgkx\" (UniqueName: \"kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.397375 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.500744 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.500966 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.501023 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgkx\" (UniqueName: \"kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.506017 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory\") pod \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.510703 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.517366 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgkx\" (UniqueName: \"kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8w2tv\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:48 crc kubenswrapper[4699]: I0226 11:44:48.628988 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"
Feb 26 11:44:49 crc kubenswrapper[4699]: I0226 11:44:49.129687 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv"]
Feb 26 11:44:49 crc kubenswrapper[4699]: I0226 11:44:49.210370 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" event={"ID":"96b6beba-4e99-4cb7-b49b-3f211c5e12b7","Type":"ContainerStarted","Data":"9574220a6d18084e8f19822098cf7500705998788cce5d11548c5e481341c3cc"}
Feb 26 11:44:50 crc kubenswrapper[4699]: I0226 11:44:50.225337 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" event={"ID":"96b6beba-4e99-4cb7-b49b-3f211c5e12b7","Type":"ContainerStarted","Data":"77c59d81e51e69d4d4ba9639877a0bef167616ca48d8fad172b42d426051feab"}
Feb 26 11:44:50 crc kubenswrapper[4699]: I0226 11:44:50.245625 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" podStartSLOduration=1.6117989640000001 podStartE2EDuration="2.245606801s" podCreationTimestamp="2026-02-26 11:44:48 +0000 UTC" firstStartedPulling="2026-02-26 11:44:49.148540439 +0000 UTC m=+2034.959366873" lastFinishedPulling="2026-02-26 11:44:49.782348276 +0000 UTC m=+2035.593174710" observedRunningTime="2026-02-26 11:44:50.242053709 +0000 UTC m=+2036.052880163" watchObservedRunningTime="2026-02-26 11:44:50.245606801 +0000 UTC m=+2036.056433235"
Feb 26 11:44:58 crc kubenswrapper[4699]: I0226 11:44:58.288838 4699 generic.go:334] "Generic (PLEG): container finished" podID="96b6beba-4e99-4cb7-b49b-3f211c5e12b7" containerID="77c59d81e51e69d4d4ba9639877a0bef167616ca48d8fad172b42d426051feab" exitCode=0
Feb 26 11:44:58 crc kubenswrapper[4699]: I0226 11:44:58.288955 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" event={"ID":"96b6beba-4e99-4cb7-b49b-3f211c5e12b7","Type":"ContainerDied","Data":"77c59d81e51e69d4d4ba9639877a0bef167616ca48d8fad172b42d426051feab"} Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.711671 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.835752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory\") pod \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.835992 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgkx\" (UniqueName: \"kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx\") pod \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.836105 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam\") pod \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\" (UID: \"96b6beba-4e99-4cb7-b49b-3f211c5e12b7\") " Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.842306 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx" (OuterVolumeSpecName: "kube-api-access-xkgkx") pod "96b6beba-4e99-4cb7-b49b-3f211c5e12b7" (UID: "96b6beba-4e99-4cb7-b49b-3f211c5e12b7"). InnerVolumeSpecName "kube-api-access-xkgkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.862458 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "96b6beba-4e99-4cb7-b49b-3f211c5e12b7" (UID: "96b6beba-4e99-4cb7-b49b-3f211c5e12b7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.863389 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory" (OuterVolumeSpecName: "inventory") pod "96b6beba-4e99-4cb7-b49b-3f211c5e12b7" (UID: "96b6beba-4e99-4cb7-b49b-3f211c5e12b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.938470 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgkx\" (UniqueName: \"kubernetes.io/projected/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-kube-api-access-xkgkx\") on node \"crc\" DevicePath \"\"" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.938502 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:44:59 crc kubenswrapper[4699]: I0226 11:44:59.938516 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96b6beba-4e99-4cb7-b49b-3f211c5e12b7-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.151780 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8"] Feb 26 11:45:00 crc 
kubenswrapper[4699]: E0226 11:45:00.152419 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b6beba-4e99-4cb7-b49b-3f211c5e12b7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.152442 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b6beba-4e99-4cb7-b49b-3f211c5e12b7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.152615 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b6beba-4e99-4cb7-b49b-3f211c5e12b7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.155726 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.159236 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.159693 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.184229 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8"] Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.244168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.244242 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.244295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567tp\" (UniqueName: \"kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.309360 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" event={"ID":"96b6beba-4e99-4cb7-b49b-3f211c5e12b7","Type":"ContainerDied","Data":"9574220a6d18084e8f19822098cf7500705998788cce5d11548c5e481341c3cc"} Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.309839 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9574220a6d18084e8f19822098cf7500705998788cce5d11548c5e481341c3cc" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.309488 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8w2tv" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.356239 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.356397 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.356519 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567tp\" (UniqueName: \"kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.360369 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.366210 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.391981 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-567tp\" (UniqueName: \"kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp\") pod \"collect-profiles-29535105-qcdn8\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.406869 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l"] Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.408185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.416006 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.416014 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.416884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.417313 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l"] Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.418434 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:45:00 crc 
kubenswrapper[4699]: I0226 11:45:00.481634 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.561649 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhg4\" (UniqueName: \"kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.561986 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.562022 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.664104 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.664181 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.664318 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhg4\" (UniqueName: \"kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.669046 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.672942 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.683161 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhg4\" (UniqueName: 
\"kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.737862 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:00 crc kubenswrapper[4699]: I0226 11:45:00.903411 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8"] Feb 26 11:45:00 crc kubenswrapper[4699]: W0226 11:45:00.909510 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3108c286_4671_45d5_ac60_fc5d8f4a9c17.slice/crio-b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11 WatchSource:0}: Error finding container b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11: Status 404 returned error can't find the container with id b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11 Feb 26 11:45:01 crc kubenswrapper[4699]: I0226 11:45:01.263527 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l"] Feb 26 11:45:01 crc kubenswrapper[4699]: W0226 11:45:01.266541 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1aabb80_3c23_4f5a_9bd1_4d573089856c.slice/crio-44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db WatchSource:0}: Error finding container 44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db: Status 404 returned error can't find the container with id 44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db Feb 26 11:45:01 crc kubenswrapper[4699]: I0226 11:45:01.339038 
4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" event={"ID":"3108c286-4671-45d5-ac60-fc5d8f4a9c17","Type":"ContainerStarted","Data":"b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11"} Feb 26 11:45:01 crc kubenswrapper[4699]: I0226 11:45:01.339967 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" event={"ID":"a1aabb80-3c23-4f5a-9bd1-4d573089856c","Type":"ContainerStarted","Data":"44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db"} Feb 26 11:45:02 crc kubenswrapper[4699]: I0226 11:45:02.350958 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" event={"ID":"3108c286-4671-45d5-ac60-fc5d8f4a9c17","Type":"ContainerDied","Data":"5adf826586bde3c763d1261019557f3e2b59eaf1c943090396c46b11018cc761"} Feb 26 11:45:02 crc kubenswrapper[4699]: I0226 11:45:02.350822 4699 generic.go:334] "Generic (PLEG): container finished" podID="3108c286-4671-45d5-ac60-fc5d8f4a9c17" containerID="5adf826586bde3c763d1261019557f3e2b59eaf1c943090396c46b11018cc761" exitCode=0 Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.365757 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" event={"ID":"a1aabb80-3c23-4f5a-9bd1-4d573089856c","Type":"ContainerStarted","Data":"8bafcbd10d6755dab078792a930009cce1ed58cd9190fd7aecf6ae0b9170fff3"} Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.398964 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" podStartSLOduration=2.488979303 podStartE2EDuration="3.398940449s" podCreationTimestamp="2026-02-26 11:45:00 +0000 UTC" firstStartedPulling="2026-02-26 11:45:01.270986887 +0000 UTC m=+2047.081813321" lastFinishedPulling="2026-02-26 
11:45:02.180948023 +0000 UTC m=+2047.991774467" observedRunningTime="2026-02-26 11:45:03.388732087 +0000 UTC m=+2049.199558541" watchObservedRunningTime="2026-02-26 11:45:03.398940449 +0000 UTC m=+2049.209766883" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.677952 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.831864 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume\") pod \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.831924 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume\") pod \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.832016 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-567tp\" (UniqueName: \"kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp\") pod \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\" (UID: \"3108c286-4671-45d5-ac60-fc5d8f4a9c17\") " Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.833257 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume" (OuterVolumeSpecName: "config-volume") pod "3108c286-4671-45d5-ac60-fc5d8f4a9c17" (UID: "3108c286-4671-45d5-ac60-fc5d8f4a9c17"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.838655 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp" (OuterVolumeSpecName: "kube-api-access-567tp") pod "3108c286-4671-45d5-ac60-fc5d8f4a9c17" (UID: "3108c286-4671-45d5-ac60-fc5d8f4a9c17"). InnerVolumeSpecName "kube-api-access-567tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.852538 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3108c286-4671-45d5-ac60-fc5d8f4a9c17" (UID: "3108c286-4671-45d5-ac60-fc5d8f4a9c17"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.935022 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3108c286-4671-45d5-ac60-fc5d8f4a9c17-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.935067 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3108c286-4671-45d5-ac60-fc5d8f4a9c17-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:03 crc kubenswrapper[4699]: I0226 11:45:03.935080 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-567tp\" (UniqueName: \"kubernetes.io/projected/3108c286-4671-45d5-ac60-fc5d8f4a9c17-kube-api-access-567tp\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:04 crc kubenswrapper[4699]: I0226 11:45:04.379591 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" 
event={"ID":"3108c286-4671-45d5-ac60-fc5d8f4a9c17","Type":"ContainerDied","Data":"b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11"} Feb 26 11:45:04 crc kubenswrapper[4699]: I0226 11:45:04.380025 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b858b8207af0ca8569020fadec0a199634c2068e68731588f058f544f56c3b11" Feb 26 11:45:04 crc kubenswrapper[4699]: I0226 11:45:04.379616 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535105-qcdn8" Feb 26 11:45:04 crc kubenswrapper[4699]: I0226 11:45:04.764337 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd"] Feb 26 11:45:04 crc kubenswrapper[4699]: I0226 11:45:04.773511 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535060-f97rd"] Feb 26 11:45:06 crc kubenswrapper[4699]: I0226 11:45:06.273226 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8a28b8-c47b-4288-877f-8e90a3b581b5" path="/var/lib/kubelet/pods/5f8a28b8-c47b-4288-877f-8e90a3b581b5/volumes" Feb 26 11:45:11 crc kubenswrapper[4699]: I0226 11:45:11.585272 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:45:11 crc kubenswrapper[4699]: I0226 11:45:11.585829 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:45:13 crc 
kubenswrapper[4699]: E0226 11:45:13.475474 4699 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.215s" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.475852 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:13 crc kubenswrapper[4699]: E0226 11:45:13.476215 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3108c286-4671-45d5-ac60-fc5d8f4a9c17" containerName="collect-profiles" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.476229 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3108c286-4671-45d5-ac60-fc5d8f4a9c17" containerName="collect-profiles" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.476424 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3108c286-4671-45d5-ac60-fc5d8f4a9c17" containerName="collect-profiles" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.478066 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.483106 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.503097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.503295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.503384 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg8xl\" (UniqueName: \"kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.607161 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.607693 4699 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.607900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.607969 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg8xl\" (UniqueName: \"kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.608585 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.634979 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg8xl\" (UniqueName: \"kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl\") pod \"redhat-marketplace-x2xcg\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:13 crc kubenswrapper[4699]: I0226 11:45:13.822220 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:14 crc kubenswrapper[4699]: I0226 11:45:14.291616 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:14 crc kubenswrapper[4699]: I0226 11:45:14.486496 4699 generic.go:334] "Generic (PLEG): container finished" podID="a1aabb80-3c23-4f5a-9bd1-4d573089856c" containerID="8bafcbd10d6755dab078792a930009cce1ed58cd9190fd7aecf6ae0b9170fff3" exitCode=0 Feb 26 11:45:14 crc kubenswrapper[4699]: I0226 11:45:14.486587 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" event={"ID":"a1aabb80-3c23-4f5a-9bd1-4d573089856c","Type":"ContainerDied","Data":"8bafcbd10d6755dab078792a930009cce1ed58cd9190fd7aecf6ae0b9170fff3"} Feb 26 11:45:14 crc kubenswrapper[4699]: I0226 11:45:14.487800 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerStarted","Data":"1369ca8ad6f82a788fe86943ed6a84b06e1a5107f0a845225dd71243c41f0ef7"} Feb 26 11:45:15 crc kubenswrapper[4699]: I0226 11:45:15.504683 4699 generic.go:334] "Generic (PLEG): container finished" podID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerID="4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a" exitCode=0 Feb 26 11:45:15 crc kubenswrapper[4699]: I0226 11:45:15.505241 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerDied","Data":"4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a"} Feb 26 11:45:15 crc kubenswrapper[4699]: I0226 11:45:15.987298 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.160990 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhg4\" (UniqueName: \"kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4\") pod \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.161055 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam\") pod \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.161146 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory\") pod \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\" (UID: \"a1aabb80-3c23-4f5a-9bd1-4d573089856c\") " Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.175614 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4" (OuterVolumeSpecName: "kube-api-access-kjhg4") pod "a1aabb80-3c23-4f5a-9bd1-4d573089856c" (UID: "a1aabb80-3c23-4f5a-9bd1-4d573089856c"). InnerVolumeSpecName "kube-api-access-kjhg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.186752 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1aabb80-3c23-4f5a-9bd1-4d573089856c" (UID: "a1aabb80-3c23-4f5a-9bd1-4d573089856c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.194394 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory" (OuterVolumeSpecName: "inventory") pod "a1aabb80-3c23-4f5a-9bd1-4d573089856c" (UID: "a1aabb80-3c23-4f5a-9bd1-4d573089856c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.263569 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhg4\" (UniqueName: \"kubernetes.io/projected/a1aabb80-3c23-4f5a-9bd1-4d573089856c-kube-api-access-kjhg4\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.263623 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.263641 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aabb80-3c23-4f5a-9bd1-4d573089856c-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.478747 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:16 crc kubenswrapper[4699]: E0226 
11:45:16.479821 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1aabb80-3c23-4f5a-9bd1-4d573089856c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.479849 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1aabb80-3c23-4f5a-9bd1-4d573089856c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.480036 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1aabb80-3c23-4f5a-9bd1-4d573089856c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.481603 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.489604 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.529565 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" event={"ID":"a1aabb80-3c23-4f5a-9bd1-4d573089856c","Type":"ContainerDied","Data":"44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db"} Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.529619 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44db8411ee307977b727b88c04835def7349e614d20428a8ac1dce69f67272db" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.529706 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.539545 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerStarted","Data":"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6"} Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.672161 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.672330 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7bf\" (UniqueName: \"kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.672444 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.674834 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv"] Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.676324 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.678774 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.678921 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.680441 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.680772 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.680953 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.681106 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.681277 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.681466 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.722986 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv"] Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774150 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774203 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774232 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774330 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7bf\" (UniqueName: 
\"kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774372 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774417 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774438 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774463 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: 
\"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774510 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774535 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774588 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774608 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnhg\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774640 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774665 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774681 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774924 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.774984 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.793832 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7bf\" (UniqueName: \"kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf\") pod \"certified-operators-cxw5q\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.803416 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876510 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876584 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnhg\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876622 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876657 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc 
kubenswrapper[4699]: I0226 11:45:16.876685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876718 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876742 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876772 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876804 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876934 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.876957 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.877017 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.877041 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.881252 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.882643 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.883311 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.883793 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.891247 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.896044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.898701 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" 
(UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.900305 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.898718 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.900745 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.901915 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc 
kubenswrapper[4699]: I0226 11:45:16.904710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnhg\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.906931 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:16 crc kubenswrapper[4699]: I0226 11:45:16.956934 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.001803 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" Feb 26 11:45:17 crc kubenswrapper[4699]: E0226 11:45:17.072824 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd945a7c6_8e43_4dae_8521_e5e8b04f612d.slice/crio-conmon-aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6.scope\": RecentStats: unable to find data in memory cache]" Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.269653 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.429562 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv"] Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.549151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" event={"ID":"e537c30c-dc6b-406f-bb86-5540ebd8a36d","Type":"ContainerStarted","Data":"d402e0aae0335e59241cdcb945195a8b0e0b32ee1dcab2f8a43f80904ff391a8"} Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.554271 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerStarted","Data":"e2dda033de4ce7f20b34713e48e734282d1b4f963bc9f0f967f7a0138dec93ca"} Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.560044 4699 generic.go:334] "Generic (PLEG): container finished" podID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerID="aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6" exitCode=0 Feb 26 11:45:17 crc kubenswrapper[4699]: I0226 11:45:17.560091 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" 
event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerDied","Data":"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6"} Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.569899 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerStarted","Data":"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a"} Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.580298 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" event={"ID":"e537c30c-dc6b-406f-bb86-5540ebd8a36d","Type":"ContainerStarted","Data":"61635d559717b9b7a130fef3ce5799ea8bf06b1611ecc7c36061490fa0b8373e"} Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.585735 4699 generic.go:334] "Generic (PLEG): container finished" podID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerID="005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978" exitCode=0 Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.585807 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerDied","Data":"005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978"} Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.602667 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2xcg" podStartSLOduration=3.166418045 podStartE2EDuration="5.602640123s" podCreationTimestamp="2026-02-26 11:45:13 +0000 UTC" firstStartedPulling="2026-02-26 11:45:15.508676654 +0000 UTC m=+2061.319503128" lastFinishedPulling="2026-02-26 11:45:17.944898762 +0000 UTC m=+2063.755725206" observedRunningTime="2026-02-26 11:45:18.59486358 +0000 UTC m=+2064.405690044" watchObservedRunningTime="2026-02-26 11:45:18.602640123 
+0000 UTC m=+2064.413466557" Feb 26 11:45:18 crc kubenswrapper[4699]: I0226 11:45:18.622089 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" podStartSLOduration=2.240843316 podStartE2EDuration="2.622051207s" podCreationTimestamp="2026-02-26 11:45:16 +0000 UTC" firstStartedPulling="2026-02-26 11:45:17.436768385 +0000 UTC m=+2063.247594819" lastFinishedPulling="2026-02-26 11:45:17.817976276 +0000 UTC m=+2063.628802710" observedRunningTime="2026-02-26 11:45:18.612573626 +0000 UTC m=+2064.423400090" watchObservedRunningTime="2026-02-26 11:45:18.622051207 +0000 UTC m=+2064.432877641" Feb 26 11:45:20 crc kubenswrapper[4699]: I0226 11:45:20.632034 4699 generic.go:334] "Generic (PLEG): container finished" podID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerID="997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310" exitCode=0 Feb 26 11:45:20 crc kubenswrapper[4699]: I0226 11:45:20.632167 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerDied","Data":"997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310"} Feb 26 11:45:21 crc kubenswrapper[4699]: I0226 11:45:21.642336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerStarted","Data":"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01"} Feb 26 11:45:21 crc kubenswrapper[4699]: I0226 11:45:21.663264 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cxw5q" podStartSLOduration=2.954975199 podStartE2EDuration="5.663240679s" podCreationTimestamp="2026-02-26 11:45:16 +0000 UTC" firstStartedPulling="2026-02-26 11:45:18.588645073 +0000 UTC m=+2064.399471517" 
lastFinishedPulling="2026-02-26 11:45:21.296910563 +0000 UTC m=+2067.107736997" observedRunningTime="2026-02-26 11:45:21.660518381 +0000 UTC m=+2067.471344845" watchObservedRunningTime="2026-02-26 11:45:21.663240679 +0000 UTC m=+2067.474067123" Feb 26 11:45:23 crc kubenswrapper[4699]: I0226 11:45:23.822101 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:23 crc kubenswrapper[4699]: I0226 11:45:23.822442 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:23 crc kubenswrapper[4699]: I0226 11:45:23.878359 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:24 crc kubenswrapper[4699]: I0226 11:45:24.722663 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:26 crc kubenswrapper[4699]: I0226 11:45:26.066318 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:26 crc kubenswrapper[4699]: I0226 11:45:26.804213 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:26 crc kubenswrapper[4699]: I0226 11:45:26.804668 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:26 crc kubenswrapper[4699]: I0226 11:45:26.855095 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:27 crc kubenswrapper[4699]: I0226 11:45:27.693886 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2xcg" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" 
containerName="registry-server" containerID="cri-o://caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a" gracePeriod=2 Feb 26 11:45:27 crc kubenswrapper[4699]: I0226 11:45:27.740442 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.130634 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.215353 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities\") pod \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.215526 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content\") pod \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.215610 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg8xl\" (UniqueName: \"kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl\") pod \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\" (UID: \"d945a7c6-8e43-4dae-8521-e5e8b04f612d\") " Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.216200 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities" (OuterVolumeSpecName: "utilities") pod "d945a7c6-8e43-4dae-8521-e5e8b04f612d" (UID: "d945a7c6-8e43-4dae-8521-e5e8b04f612d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.223310 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl" (OuterVolumeSpecName: "kube-api-access-mg8xl") pod "d945a7c6-8e43-4dae-8521-e5e8b04f612d" (UID: "d945a7c6-8e43-4dae-8521-e5e8b04f612d"). InnerVolumeSpecName "kube-api-access-mg8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.237390 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d945a7c6-8e43-4dae-8521-e5e8b04f612d" (UID: "d945a7c6-8e43-4dae-8521-e5e8b04f612d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.317896 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.317926 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d945a7c6-8e43-4dae-8521-e5e8b04f612d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.317936 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg8xl\" (UniqueName: \"kubernetes.io/projected/d945a7c6-8e43-4dae-8521-e5e8b04f612d-kube-api-access-mg8xl\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.703935 4699 generic.go:334] "Generic (PLEG): container finished" podID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" 
containerID="caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a" exitCode=0 Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.704033 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2xcg" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.704077 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerDied","Data":"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a"} Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.704450 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2xcg" event={"ID":"d945a7c6-8e43-4dae-8521-e5e8b04f612d","Type":"ContainerDied","Data":"1369ca8ad6f82a788fe86943ed6a84b06e1a5107f0a845225dd71243c41f0ef7"} Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.704486 4699 scope.go:117] "RemoveContainer" containerID="caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.732639 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.733915 4699 scope.go:117] "RemoveContainer" containerID="aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.740779 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2xcg"] Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.756774 4699 scope.go:117] "RemoveContainer" containerID="4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.795738 4699 scope.go:117] "RemoveContainer" containerID="caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a" Feb 26 
11:45:28 crc kubenswrapper[4699]: E0226 11:45:28.796177 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a\": container with ID starting with caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a not found: ID does not exist" containerID="caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.796220 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a"} err="failed to get container status \"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a\": rpc error: code = NotFound desc = could not find container \"caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a\": container with ID starting with caf977fa548550b2ff07142c0cf5642b71f889e64d844100198dfaac448f0f9a not found: ID does not exist" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.796246 4699 scope.go:117] "RemoveContainer" containerID="aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6" Feb 26 11:45:28 crc kubenswrapper[4699]: E0226 11:45:28.796691 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6\": container with ID starting with aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6 not found: ID does not exist" containerID="aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.796752 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6"} err="failed to get container status 
\"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6\": rpc error: code = NotFound desc = could not find container \"aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6\": container with ID starting with aec8d60c34d09df244e1612b82212b1617de26a8807d6e9649ca7c202efe59a6 not found: ID does not exist" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.796790 4699 scope.go:117] "RemoveContainer" containerID="4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a" Feb 26 11:45:28 crc kubenswrapper[4699]: E0226 11:45:28.797087 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a\": container with ID starting with 4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a not found: ID does not exist" containerID="4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a" Feb 26 11:45:28 crc kubenswrapper[4699]: I0226 11:45:28.797108 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a"} err="failed to get container status \"4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a\": rpc error: code = NotFound desc = could not find container \"4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a\": container with ID starting with 4cf3d7d09ae2ea0a26c0849df1b9968db98d3306fdbbd4dc77bd87b713945f7a not found: ID does not exist" Feb 26 11:45:29 crc kubenswrapper[4699]: I0226 11:45:29.266383 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:30 crc kubenswrapper[4699]: I0226 11:45:30.271427 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" path="/var/lib/kubelet/pods/d945a7c6-8e43-4dae-8521-e5e8b04f612d/volumes" Feb 26 
11:45:30 crc kubenswrapper[4699]: I0226 11:45:30.725245 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cxw5q" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="registry-server" containerID="cri-o://a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01" gracePeriod=2 Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.147597 4699 scope.go:117] "RemoveContainer" containerID="c08f0ffa53e77347fd581c677192ce80109e73083d1caad9bb7251a920a34172" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.157695 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.206129 4699 scope.go:117] "RemoveContainer" containerID="61a2c48ee6bf74ea4766fbbb38a98752e4fc1a270493117d88d14b6af7b2c988" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.270862 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities\") pod \"b82d852b-9054-4ee4-96b8-36f007b257f3\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.270960 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7bf\" (UniqueName: \"kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf\") pod \"b82d852b-9054-4ee4-96b8-36f007b257f3\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.271153 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content\") pod \"b82d852b-9054-4ee4-96b8-36f007b257f3\" (UID: \"b82d852b-9054-4ee4-96b8-36f007b257f3\") " Feb 26 
11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.271864 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities" (OuterVolumeSpecName: "utilities") pod "b82d852b-9054-4ee4-96b8-36f007b257f3" (UID: "b82d852b-9054-4ee4-96b8-36f007b257f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.278293 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf" (OuterVolumeSpecName: "kube-api-access-jb7bf") pod "b82d852b-9054-4ee4-96b8-36f007b257f3" (UID: "b82d852b-9054-4ee4-96b8-36f007b257f3"). InnerVolumeSpecName "kube-api-access-jb7bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.323603 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b82d852b-9054-4ee4-96b8-36f007b257f3" (UID: "b82d852b-9054-4ee4-96b8-36f007b257f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.373756 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.373788 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7bf\" (UniqueName: \"kubernetes.io/projected/b82d852b-9054-4ee4-96b8-36f007b257f3-kube-api-access-jb7bf\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.373823 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82d852b-9054-4ee4-96b8-36f007b257f3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.736933 4699 generic.go:334] "Generic (PLEG): container finished" podID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerID="a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01" exitCode=0 Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.736988 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cxw5q" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.737002 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerDied","Data":"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01"} Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.737191 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxw5q" event={"ID":"b82d852b-9054-4ee4-96b8-36f007b257f3","Type":"ContainerDied","Data":"e2dda033de4ce7f20b34713e48e734282d1b4f963bc9f0f967f7a0138dec93ca"} Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.737228 4699 scope.go:117] "RemoveContainer" containerID="a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.754713 4699 scope.go:117] "RemoveContainer" containerID="997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.773943 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.779278 4699 scope.go:117] "RemoveContainer" containerID="005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.784809 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cxw5q"] Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.799163 4699 scope.go:117] "RemoveContainer" containerID="a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01" Feb 26 11:45:31 crc kubenswrapper[4699]: E0226 11:45:31.799636 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01\": container with ID starting with a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01 not found: ID does not exist" containerID="a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.799675 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01"} err="failed to get container status \"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01\": rpc error: code = NotFound desc = could not find container \"a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01\": container with ID starting with a376c7d1e53f63537ecffcad45159d60d5f1146fdbc4b44ddb10c3779c4f9e01 not found: ID does not exist" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.799701 4699 scope.go:117] "RemoveContainer" containerID="997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310" Feb 26 11:45:31 crc kubenswrapper[4699]: E0226 11:45:31.800094 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310\": container with ID starting with 997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310 not found: ID does not exist" containerID="997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.800163 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310"} err="failed to get container status \"997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310\": rpc error: code = NotFound desc = could not find container \"997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310\": container with ID 
starting with 997ec4ad763969aa9a44f4675ac014308b3c270558206c207a14b13763bab310 not found: ID does not exist" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.800177 4699 scope.go:117] "RemoveContainer" containerID="005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978" Feb 26 11:45:31 crc kubenswrapper[4699]: E0226 11:45:31.800697 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978\": container with ID starting with 005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978 not found: ID does not exist" containerID="005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978" Feb 26 11:45:31 crc kubenswrapper[4699]: I0226 11:45:31.800737 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978"} err="failed to get container status \"005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978\": rpc error: code = NotFound desc = could not find container \"005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978\": container with ID starting with 005da2508f8e2e44e7242ce1462ad8bb033f8eace2f1fa0bfabd16956b3b2978 not found: ID does not exist" Feb 26 11:45:32 crc kubenswrapper[4699]: I0226 11:45:32.273698 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" path="/var/lib/kubelet/pods/b82d852b-9054-4ee4-96b8-36f007b257f3/volumes" Feb 26 11:45:41 crc kubenswrapper[4699]: I0226 11:45:41.584855 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:45:41 crc kubenswrapper[4699]: I0226 
11:45:41.585494 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:45:53 crc kubenswrapper[4699]: I0226 11:45:53.946618 4699 generic.go:334] "Generic (PLEG): container finished" podID="e537c30c-dc6b-406f-bb86-5540ebd8a36d" containerID="61635d559717b9b7a130fef3ce5799ea8bf06b1611ecc7c36061490fa0b8373e" exitCode=0
Feb 26 11:45:53 crc kubenswrapper[4699]: I0226 11:45:53.946745 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" event={"ID":"e537c30c-dc6b-406f-bb86-5540ebd8a36d","Type":"ContainerDied","Data":"61635d559717b9b7a130fef3ce5799ea8bf06b1611ecc7c36061490fa0b8373e"}
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.310900 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv"
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471631 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471728 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471836 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjnhg\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471872 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471916 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.471966 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472018 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472058 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472095 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472178 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472209 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472242 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472284 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.472315 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle\") pod \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\" (UID: \"e537c30c-dc6b-406f-bb86-5540ebd8a36d\") "
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.477511 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.478361 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg" (OuterVolumeSpecName: "kube-api-access-cjnhg") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "kube-api-access-cjnhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.479760 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.480142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.481145 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.481644 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.482022 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.482063 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.482085 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.482129 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.482487 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.483833 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.505937 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.507506 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory" (OuterVolumeSpecName: "inventory") pod "e537c30c-dc6b-406f-bb86-5540ebd8a36d" (UID: "e537c30c-dc6b-406f-bb86-5540ebd8a36d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575002 4699 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575047 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjnhg\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-kube-api-access-cjnhg\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575059 4699 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575071 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575086 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575104 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575135 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575152 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e537c30c-dc6b-406f-bb86-5540ebd8a36d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575164 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575175 4699 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575186 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575196 4699 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575207 4699 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.575218 4699 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537c30c-dc6b-406f-bb86-5540ebd8a36d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.966989 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv" event={"ID":"e537c30c-dc6b-406f-bb86-5540ebd8a36d","Type":"ContainerDied","Data":"d402e0aae0335e59241cdcb945195a8b0e0b32ee1dcab2f8a43f80904ff391a8"}
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.967076 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d402e0aae0335e59241cdcb945195a8b0e0b32ee1dcab2f8a43f80904ff391a8"
Feb 26 11:45:55 crc kubenswrapper[4699]: I0226 11:45:55.967137 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.051630 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"]
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052515 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052540 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052562 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="extract-content"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052571 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="extract-content"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052609 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="extract-content"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052619 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="extract-content"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052629 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="extract-utilities"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052637 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="extract-utilities"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052661 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="extract-utilities"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052669 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="extract-utilities"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052685 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e537c30c-dc6b-406f-bb86-5540ebd8a36d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052694 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e537c30c-dc6b-406f-bb86-5540ebd8a36d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:45:56 crc kubenswrapper[4699]: E0226 11:45:56.052704 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052712 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052929 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e537c30c-dc6b-406f-bb86-5540ebd8a36d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052947 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d945a7c6-8e43-4dae-8521-e5e8b04f612d" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.052977 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82d852b-9054-4ee4-96b8-36f007b257f3" containerName="registry-server"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.053857 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.056421 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.056615 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.056795 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.056865 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.056815 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.068426 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"]
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.087629 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.087772 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.087890 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sn7j\" (UniqueName: \"kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.087955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.088080 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.189623 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.189685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sn7j\" (UniqueName: \"kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.189717 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.189806 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.189940 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.193178 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.193277 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.193720 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.201455 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.202843 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.205509 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.205959 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.209044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sn7j\" (UniqueName: \"kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hmpqg\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.379017 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.389406 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.885646 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg"]
Feb 26 11:45:56 crc kubenswrapper[4699]: I0226 11:45:56.979562 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" event={"ID":"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b","Type":"ContainerStarted","Data":"6fec481dc036c5b83368c4004e881ded73cc4b939fc8a01a5a352115b40fddcc"}
Feb 26 11:45:57 crc kubenswrapper[4699]: I0226 11:45:57.589927 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:45:57 crc kubenswrapper[4699]: I0226 11:45:57.992014 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" event={"ID":"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b","Type":"ContainerStarted","Data":"ee40b4b1a8d8cb5e3af0d9816425a6ede7800a1df3aca66053e669a22650ea0b"}
Feb 26 11:45:58 crc kubenswrapper[4699]: I0226 11:45:58.027675 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" podStartSLOduration=1.328654826 podStartE2EDuration="2.027458839s" podCreationTimestamp="2026-02-26 11:45:56 +0000 UTC" firstStartedPulling="2026-02-26 11:45:56.887797948 +0000 UTC m=+2102.698624372" lastFinishedPulling="2026-02-26 11:45:57.586601951 +0000 UTC m=+2103.397428385" observedRunningTime="2026-02-26 11:45:58.017681391 +0000 UTC m=+2103.828507835" watchObservedRunningTime="2026-02-26 11:45:58.027458839 +0000 UTC m=+2103.838285283"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.131255 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535106-cv2s5"]
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.132881 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.139536 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.139905 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.140311 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.150468 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535106-cv2s5"]
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.164267 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47mg\" (UniqueName: \"kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg\") pod \"auto-csr-approver-29535106-cv2s5\" (UID: \"277ed376-d775-489c-82e7-93962bd513ff\") " pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.266661 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h47mg\" (UniqueName: \"kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg\") pod \"auto-csr-approver-29535106-cv2s5\" (UID: \"277ed376-d775-489c-82e7-93962bd513ff\") " pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.283970 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47mg\" (UniqueName: \"kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg\") pod \"auto-csr-approver-29535106-cv2s5\" (UID: \"277ed376-d775-489c-82e7-93962bd513ff\") " pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.455242 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:00 crc kubenswrapper[4699]: I0226 11:46:00.875098 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535106-cv2s5"]
Feb 26 11:46:01 crc kubenswrapper[4699]: I0226 11:46:01.016942 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535106-cv2s5" event={"ID":"277ed376-d775-489c-82e7-93962bd513ff","Type":"ContainerStarted","Data":"741fa4c3f9f588aa570fbe95633265e19299d4dd53f53505e7fa07dabc271811"}
Feb 26 11:46:03 crc kubenswrapper[4699]: I0226 11:46:03.037748 4699 generic.go:334] "Generic (PLEG): container finished" podID="277ed376-d775-489c-82e7-93962bd513ff" containerID="6316bd489dab2ee525da2e5168f12e3d42a5b7c5139e77da702337350ea3b44a" exitCode=0
Feb 26 11:46:03 crc kubenswrapper[4699]: I0226 11:46:03.038023 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535106-cv2s5" event={"ID":"277ed376-d775-489c-82e7-93962bd513ff","Type":"ContainerDied","Data":"6316bd489dab2ee525da2e5168f12e3d42a5b7c5139e77da702337350ea3b44a"}
Feb 26 11:46:04 crc kubenswrapper[4699]: I0226 11:46:04.352098 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:04 crc kubenswrapper[4699]: I0226 11:46:04.460845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h47mg\" (UniqueName: \"kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg\") pod \"277ed376-d775-489c-82e7-93962bd513ff\" (UID: \"277ed376-d775-489c-82e7-93962bd513ff\") "
Feb 26 11:46:04 crc kubenswrapper[4699]: I0226 11:46:04.466735 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg" (OuterVolumeSpecName: "kube-api-access-h47mg") pod "277ed376-d775-489c-82e7-93962bd513ff" (UID: "277ed376-d775-489c-82e7-93962bd513ff"). InnerVolumeSpecName "kube-api-access-h47mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:46:04 crc kubenswrapper[4699]: I0226 11:46:04.565658 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h47mg\" (UniqueName: \"kubernetes.io/projected/277ed376-d775-489c-82e7-93962bd513ff-kube-api-access-h47mg\") on node \"crc\" DevicePath \"\""
Feb 26 11:46:05 crc kubenswrapper[4699]: I0226 11:46:05.057663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535106-cv2s5" event={"ID":"277ed376-d775-489c-82e7-93962bd513ff","Type":"ContainerDied","Data":"741fa4c3f9f588aa570fbe95633265e19299d4dd53f53505e7fa07dabc271811"}
Feb 26 11:46:05 crc kubenswrapper[4699]: I0226 11:46:05.057707 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="741fa4c3f9f588aa570fbe95633265e19299d4dd53f53505e7fa07dabc271811"
Feb 26 11:46:05 crc kubenswrapper[4699]: I0226 11:46:05.057785 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535106-cv2s5"
Feb 26 11:46:05 crc kubenswrapper[4699]: I0226 11:46:05.438694 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535100-2fxw5"]
Feb 26 11:46:05 crc kubenswrapper[4699]: I0226 11:46:05.445807 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535100-2fxw5"]
Feb 26 11:46:06 crc kubenswrapper[4699]: I0226 11:46:06.275738 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db34348f-7e21-4666-8e45-c48a1fdbe2a4" path="/var/lib/kubelet/pods/db34348f-7e21-4666-8e45-c48a1fdbe2a4/volumes"
Feb 26 11:46:11 crc kubenswrapper[4699]: I0226 11:46:11.585468 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 11:46:11 crc kubenswrapper[4699]: I0226 11:46:11.586276 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:46:11 crc kubenswrapper[4699]: I0226 11:46:11.586344 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:46:11 crc kubenswrapper[4699]: I0226 11:46:11.587455 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:46:11 crc kubenswrapper[4699]: I0226 11:46:11.587535 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345" gracePeriod=600 Feb 26 11:46:12 crc kubenswrapper[4699]: I0226 11:46:12.121584 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345" exitCode=0 Feb 26 11:46:12 crc kubenswrapper[4699]: I0226 11:46:12.121617 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345"} Feb 26 11:46:12 crc kubenswrapper[4699]: I0226 11:46:12.122215 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde"} Feb 26 11:46:12 crc kubenswrapper[4699]: I0226 11:46:12.122240 4699 scope.go:117] "RemoveContainer" containerID="44e3df3b72d5e05b5e3932b70a80af4e9396963ab4e1e6f5434efe863b881c99" Feb 26 11:46:31 crc kubenswrapper[4699]: I0226 11:46:31.288024 4699 scope.go:117] "RemoveContainer" containerID="4505b88d80198e91d210a89e948ba5fb9b137a6a7006ae878e49e6ab4a45d98a" Feb 26 11:46:59 crc kubenswrapper[4699]: I0226 11:46:59.544227 4699 generic.go:334] "Generic (PLEG): container finished" podID="dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" containerID="ee40b4b1a8d8cb5e3af0d9816425a6ede7800a1df3aca66053e669a22650ea0b" exitCode=0 Feb 26 11:46:59 crc kubenswrapper[4699]: I0226 11:46:59.544338 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" event={"ID":"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b","Type":"ContainerDied","Data":"ee40b4b1a8d8cb5e3af0d9816425a6ede7800a1df3aca66053e669a22650ea0b"} Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.031014 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.147851 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle\") pod \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.148206 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam\") pod \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.148239 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0\") pod \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.148275 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sn7j\" (UniqueName: \"kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j\") pod \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.148317 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory\") pod \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\" (UID: \"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b\") " Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.155812 4699 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j" (OuterVolumeSpecName: "kube-api-access-7sn7j") pod "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" (UID: "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b"). InnerVolumeSpecName "kube-api-access-7sn7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.157832 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" (UID: "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.176638 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" (UID: "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.185167 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" (UID: "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.187385 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory" (OuterVolumeSpecName: "inventory") pod "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" (UID: "dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.251608 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.251648 4699 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.251659 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sn7j\" (UniqueName: \"kubernetes.io/projected/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-kube-api-access-7sn7j\") on node \"crc\" DevicePath \"\"" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.251668 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.251678 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.567671 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" event={"ID":"dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b","Type":"ContainerDied","Data":"6fec481dc036c5b83368c4004e881ded73cc4b939fc8a01a5a352115b40fddcc"} Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.567721 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fec481dc036c5b83368c4004e881ded73cc4b939fc8a01a5a352115b40fddcc" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.567742 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hmpqg" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.687583 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l"] Feb 26 11:47:01 crc kubenswrapper[4699]: E0226 11:47:01.688072 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277ed376-d775-489c-82e7-93962bd513ff" containerName="oc" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.688096 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="277ed376-d775-489c-82e7-93962bd513ff" containerName="oc" Feb 26 11:47:01 crc kubenswrapper[4699]: E0226 11:47:01.688160 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.688172 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.688431 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.688455 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="277ed376-d775-489c-82e7-93962bd513ff" containerName="oc" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.689235 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.691456 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.691498 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.691822 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.691846 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.692174 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.693100 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.702814 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l"] Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.862501 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.862552 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.862585 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.862609 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.862675 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjv4v\" (UniqueName: \"kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.863054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.964735 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.964896 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.964959 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.964983 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.965077 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.965139 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjv4v\" (UniqueName: \"kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.969267 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 
crc kubenswrapper[4699]: I0226 11:47:01.969294 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.969365 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.969770 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.970913 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:01 crc kubenswrapper[4699]: I0226 11:47:01.984004 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjv4v\" 
(UniqueName: \"kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:02 crc kubenswrapper[4699]: I0226 11:47:02.006589 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" Feb 26 11:47:02 crc kubenswrapper[4699]: I0226 11:47:02.542711 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l"] Feb 26 11:47:02 crc kubenswrapper[4699]: I0226 11:47:02.549883 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:47:02 crc kubenswrapper[4699]: I0226 11:47:02.583889 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" event={"ID":"59456382-a459-4f82-ac99-b96eb735ddb9","Type":"ContainerStarted","Data":"b65ffd3662b40c51c98c1dd30170152c5d509e1d0fe771319b3ef00e26682063"} Feb 26 11:47:03 crc kubenswrapper[4699]: I0226 11:47:03.594824 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" event={"ID":"59456382-a459-4f82-ac99-b96eb735ddb9","Type":"ContainerStarted","Data":"08a5874a3b7c9d905481c4a6b7b1f36886135a1f3140e5983bc7888075a8dbaa"} Feb 26 11:47:03 crc kubenswrapper[4699]: I0226 11:47:03.613686 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" podStartSLOduration=2.000379345 podStartE2EDuration="2.613643779s" podCreationTimestamp="2026-02-26 11:47:01 +0000 UTC" firstStartedPulling="2026-02-26 11:47:02.549638715 +0000 UTC m=+2168.360465149" lastFinishedPulling="2026-02-26 
11:47:03.162903149 +0000 UTC m=+2168.973729583" observedRunningTime="2026-02-26 11:47:03.610189011 +0000 UTC m=+2169.421015455" watchObservedRunningTime="2026-02-26 11:47:03.613643779 +0000 UTC m=+2169.424470223" Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.747377 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6t992"] Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.750454 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6t992" Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.757799 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6t992"] Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.882670 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992" Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.883440 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992" Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.883575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpcd\" (UniqueName: \"kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992" Feb 26 11:47:10 crc 
kubenswrapper[4699]: I0226 11:47:10.985396 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.985465 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.985515 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpcd\" (UniqueName: \"kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.985973 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:10 crc kubenswrapper[4699]: I0226 11:47:10.986052 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:11 crc kubenswrapper[4699]: I0226 11:47:11.011572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpcd\" (UniqueName: \"kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd\") pod \"redhat-operators-6t992\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") " pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:11 crc kubenswrapper[4699]: I0226 11:47:11.071232 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:11 crc kubenswrapper[4699]: I0226 11:47:11.632472 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6t992"]
Feb 26 11:47:11 crc kubenswrapper[4699]: I0226 11:47:11.711948 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerStarted","Data":"27550c0b57d27ece73dfcb2f3c7ecab7a2bc0f6ea93790910de25028b7547595"}
Feb 26 11:47:12 crc kubenswrapper[4699]: I0226 11:47:12.723371 4699 generic.go:334] "Generic (PLEG): container finished" podID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerID="2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf" exitCode=0
Feb 26 11:47:12 crc kubenswrapper[4699]: I0226 11:47:12.723711 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerDied","Data":"2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf"}
Feb 26 11:47:13 crc kubenswrapper[4699]: I0226 11:47:13.734071 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerStarted","Data":"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"}
Feb 26 11:47:19 crc kubenswrapper[4699]: I0226 11:47:19.782725 4699 generic.go:334] "Generic (PLEG): container finished" podID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerID="8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a" exitCode=0
Feb 26 11:47:19 crc kubenswrapper[4699]: I0226 11:47:19.782836 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerDied","Data":"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"}
Feb 26 11:47:20 crc kubenswrapper[4699]: I0226 11:47:20.794706 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerStarted","Data":"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"}
Feb 26 11:47:20 crc kubenswrapper[4699]: I0226 11:47:20.816542 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6t992" podStartSLOduration=3.227941036 podStartE2EDuration="10.816507841s" podCreationTimestamp="2026-02-26 11:47:10 +0000 UTC" firstStartedPulling="2026-02-26 11:47:12.725995906 +0000 UTC m=+2178.536822340" lastFinishedPulling="2026-02-26 11:47:20.314562711 +0000 UTC m=+2186.125389145" observedRunningTime="2026-02-26 11:47:20.811599711 +0000 UTC m=+2186.622426135" watchObservedRunningTime="2026-02-26 11:47:20.816507841 +0000 UTC m=+2186.627334275"
Feb 26 11:47:21 crc kubenswrapper[4699]: I0226 11:47:21.072417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:21 crc kubenswrapper[4699]: I0226 11:47:21.072473 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:22 crc kubenswrapper[4699]: I0226 11:47:22.123094 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6t992" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="registry-server" probeResult="failure" output=<
Feb 26 11:47:22 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s
Feb 26 11:47:22 crc kubenswrapper[4699]: >
Feb 26 11:47:31 crc kubenswrapper[4699]: I0226 11:47:31.123770 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:31 crc kubenswrapper[4699]: I0226 11:47:31.170146 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:31 crc kubenswrapper[4699]: I0226 11:47:31.357653 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6t992"]
Feb 26 11:47:32 crc kubenswrapper[4699]: I0226 11:47:32.893082 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6t992" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="registry-server" containerID="cri-o://e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a" gracePeriod=2
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.356812 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.520088 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities\") pod \"17908e12-55e0-4e17-9ffb-a33a2208c13c\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") "
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.520283 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfpcd\" (UniqueName: \"kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd\") pod \"17908e12-55e0-4e17-9ffb-a33a2208c13c\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") "
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.520312 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content\") pod \"17908e12-55e0-4e17-9ffb-a33a2208c13c\" (UID: \"17908e12-55e0-4e17-9ffb-a33a2208c13c\") "
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.521812 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities" (OuterVolumeSpecName: "utilities") pod "17908e12-55e0-4e17-9ffb-a33a2208c13c" (UID: "17908e12-55e0-4e17-9ffb-a33a2208c13c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.527666 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd" (OuterVolumeSpecName: "kube-api-access-pfpcd") pod "17908e12-55e0-4e17-9ffb-a33a2208c13c" (UID: "17908e12-55e0-4e17-9ffb-a33a2208c13c"). InnerVolumeSpecName "kube-api-access-pfpcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.622627 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.622659 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfpcd\" (UniqueName: \"kubernetes.io/projected/17908e12-55e0-4e17-9ffb-a33a2208c13c-kube-api-access-pfpcd\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.639382 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17908e12-55e0-4e17-9ffb-a33a2208c13c" (UID: "17908e12-55e0-4e17-9ffb-a33a2208c13c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.724745 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17908e12-55e0-4e17-9ffb-a33a2208c13c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.906360 4699 generic.go:334] "Generic (PLEG): container finished" podID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerID="e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a" exitCode=0
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.906401 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerDied","Data":"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"}
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.906432 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6t992" event={"ID":"17908e12-55e0-4e17-9ffb-a33a2208c13c","Type":"ContainerDied","Data":"27550c0b57d27ece73dfcb2f3c7ecab7a2bc0f6ea93790910de25028b7547595"}
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.906450 4699 scope.go:117] "RemoveContainer" containerID="e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.906496 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6t992"
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.928445 4699 scope.go:117] "RemoveContainer" containerID="8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.953859 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6t992"]
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.968518 4699 scope.go:117] "RemoveContainer" containerID="2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf"
Feb 26 11:47:33 crc kubenswrapper[4699]: I0226 11:47:33.993324 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6t992"]
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.017033 4699 scope.go:117] "RemoveContainer" containerID="e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"
Feb 26 11:47:34 crc kubenswrapper[4699]: E0226 11:47:34.017781 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a\": container with ID starting with e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a not found: ID does not exist" containerID="e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.017867 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a"} err="failed to get container status \"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a\": rpc error: code = NotFound desc = could not find container \"e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a\": container with ID starting with e04e82cff708119a13028fa8ab510b921da08edef6a16507873b418d4c86cb1a not found: ID does not exist"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.017899 4699 scope.go:117] "RemoveContainer" containerID="8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"
Feb 26 11:47:34 crc kubenswrapper[4699]: E0226 11:47:34.018584 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a\": container with ID starting with 8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a not found: ID does not exist" containerID="8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.018619 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a"} err="failed to get container status \"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a\": rpc error: code = NotFound desc = could not find container \"8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a\": container with ID starting with 8d0b1ac4594d259abec4b98cdcc147f8d8a0f1d0e1fa7c1dc5a734331fac319a not found: ID does not exist"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.018644 4699 scope.go:117] "RemoveContainer" containerID="2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf"
Feb 26 11:47:34 crc kubenswrapper[4699]: E0226 11:47:34.018969 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf\": container with ID starting with 2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf not found: ID does not exist" containerID="2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.018995 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf"} err="failed to get container status \"2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf\": rpc error: code = NotFound desc = could not find container \"2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf\": container with ID starting with 2ce84e69f7d9b5f03ad4ad405f0853dcfef618c84355f2c04940b92b37190ddf not found: ID does not exist"
Feb 26 11:47:34 crc kubenswrapper[4699]: I0226 11:47:34.274211 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" path="/var/lib/kubelet/pods/17908e12-55e0-4e17-9ffb-a33a2208c13c/volumes"
Feb 26 11:47:52 crc kubenswrapper[4699]: I0226 11:47:52.135631 4699 generic.go:334] "Generic (PLEG): container finished" podID="59456382-a459-4f82-ac99-b96eb735ddb9" containerID="08a5874a3b7c9d905481c4a6b7b1f36886135a1f3140e5983bc7888075a8dbaa" exitCode=0
Feb 26 11:47:52 crc kubenswrapper[4699]: I0226 11:47:52.135715 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" event={"ID":"59456382-a459-4f82-ac99-b96eb735ddb9","Type":"ContainerDied","Data":"08a5874a3b7c9d905481c4a6b7b1f36886135a1f3140e5983bc7888075a8dbaa"}
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.622704 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l"
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.810637 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjv4v\" (UniqueName: \"kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.810723 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.810752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.810786 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.810992 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.811072 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam\") pod \"59456382-a459-4f82-ac99-b96eb735ddb9\" (UID: \"59456382-a459-4f82-ac99-b96eb735ddb9\") "
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.818649 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v" (OuterVolumeSpecName: "kube-api-access-tjv4v") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "kube-api-access-tjv4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.821104 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.846125 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.855065 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.856913 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory" (OuterVolumeSpecName: "inventory") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.862415 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "59456382-a459-4f82-ac99-b96eb735ddb9" (UID: "59456382-a459-4f82-ac99-b96eb735ddb9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981283 4699 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981331 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981345 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjv4v\" (UniqueName: \"kubernetes.io/projected/59456382-a459-4f82-ac99-b96eb735ddb9-kube-api-access-tjv4v\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981359 4699 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981369 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:53 crc kubenswrapper[4699]: I0226 11:47:53.981379 4699 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59456382-a459-4f82-ac99-b96eb735ddb9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.157989 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l" event={"ID":"59456382-a459-4f82-ac99-b96eb735ddb9","Type":"ContainerDied","Data":"b65ffd3662b40c51c98c1dd30170152c5d509e1d0fe771319b3ef00e26682063"}
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.158033 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b65ffd3662b40c51c98c1dd30170152c5d509e1d0fe771319b3ef00e26682063"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.158052 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.250147 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"]
Feb 26 11:47:54 crc kubenswrapper[4699]: E0226 11:47:54.252676 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="extract-content"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.252705 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="extract-content"
Feb 26 11:47:54 crc kubenswrapper[4699]: E0226 11:47:54.252723 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="registry-server"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.252732 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="registry-server"
Feb 26 11:47:54 crc kubenswrapper[4699]: E0226 11:47:54.252752 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59456382-a459-4f82-ac99-b96eb735ddb9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.252760 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="59456382-a459-4f82-ac99-b96eb735ddb9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:47:54 crc kubenswrapper[4699]: E0226 11:47:54.252774 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="extract-utilities"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.252781 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="extract-utilities"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.253012 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="59456382-a459-4f82-ac99-b96eb735ddb9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.253033 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="17908e12-55e0-4e17-9ffb-a33a2208c13c" containerName="registry-server"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.253711 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.257267 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.257276 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.257387 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.257424 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.257648 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.277654 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"]
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.389298 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.390251 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.390432 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.390583 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.390624 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ppn\" (UniqueName: \"kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.492313 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.492397 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.492424 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ppn\" (UniqueName: \"kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.492483 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.492547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.496578 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.496585 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.496596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.503783 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.513714 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57ppn\" (UniqueName: \"kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:54 crc kubenswrapper[4699]: I0226 11:47:54.585576 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"
Feb 26 11:47:55 crc kubenswrapper[4699]: I0226 11:47:55.120723 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f"]
Feb 26 11:47:55 crc kubenswrapper[4699]: I0226 11:47:55.167702 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" event={"ID":"6436c321-6850-4db3-81b2-0dc329e10900","Type":"ContainerStarted","Data":"da7eabc20b73f3cfcb5f479d6c26b5a779dbd02d4697b1b42ef3653df7b2ae5b"}
Feb 26 11:47:56 crc kubenswrapper[4699]: I0226 11:47:56.178715 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" event={"ID":"6436c321-6850-4db3-81b2-0dc329e10900","Type":"ContainerStarted","Data":"8bd3df01daa0942902ceb3f721b2d365aa21e62ede502d0a6f006ad1267cec53"}
Feb 26 11:47:56 crc kubenswrapper[4699]: I0226 11:47:56.200376 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" podStartSLOduration=1.3859444220000001 podStartE2EDuration="2.200354882s" podCreationTimestamp="2026-02-26 11:47:54 +0000 UTC" firstStartedPulling="2026-02-26 11:47:55.122782917 +0000 UTC m=+2220.933609351" lastFinishedPulling="2026-02-26 11:47:55.937193377 +0000 UTC m=+2221.748019811" observedRunningTime="2026-02-26 11:47:56.196540364 +0000 UTC m=+2222.007366798" watchObservedRunningTime="2026-02-26 11:47:56.200354882 +0000 UTC m=+2222.011181316"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.137052 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535108-79cdj"]
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.139582 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535108-79cdj"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.142834 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.143334 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.143636 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.155919 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535108-79cdj"]
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.255012 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwkv\" (UniqueName: \"kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv\") pod \"auto-csr-approver-29535108-79cdj\" (UID: \"6366100d-f68c-43ce-879b-4cc3f80c8156\") " pod="openshift-infra/auto-csr-approver-29535108-79cdj"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.357207 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwkv\" (UniqueName: \"kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv\") pod \"auto-csr-approver-29535108-79cdj\" (UID: \"6366100d-f68c-43ce-879b-4cc3f80c8156\") " pod="openshift-infra/auto-csr-approver-29535108-79cdj"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.381739 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwkv\" (UniqueName: \"kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv\") pod \"auto-csr-approver-29535108-79cdj\" (UID: \"6366100d-f68c-43ce-879b-4cc3f80c8156\") " pod="openshift-infra/auto-csr-approver-29535108-79cdj"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.467242 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535108-79cdj"
Feb 26 11:48:00 crc kubenswrapper[4699]: I0226 11:48:00.937721 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535108-79cdj"]
Feb 26 11:48:01 crc kubenswrapper[4699]: I0226 11:48:01.261463 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535108-79cdj" event={"ID":"6366100d-f68c-43ce-879b-4cc3f80c8156","Type":"ContainerStarted","Data":"9550e0566c3abf9a4ff53e5eebe42cb9ed71dc39b77b450749cfbf15b78168d7"}
Feb 26 11:48:04 crc kubenswrapper[4699]: I0226 11:48:04.299834 4699 generic.go:334] "Generic (PLEG): container finished" podID="6366100d-f68c-43ce-879b-4cc3f80c8156" containerID="a49e0f6a5b8aa98c17ff2dc316f41da6f3d780c3f18aaef30599837dcc6bc0ea" exitCode=0
Feb 26 11:48:04 crc kubenswrapper[4699]: I0226 11:48:04.299946 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535108-79cdj" event={"ID":"6366100d-f68c-43ce-879b-4cc3f80c8156","Type":"ContainerDied","Data":"a49e0f6a5b8aa98c17ff2dc316f41da6f3d780c3f18aaef30599837dcc6bc0ea"}
Feb 26 11:48:05 crc kubenswrapper[4699]: I0226 11:48:05.680043 4699 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535108-79cdj" Feb 26 11:48:05 crc kubenswrapper[4699]: I0226 11:48:05.808875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgwkv\" (UniqueName: \"kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv\") pod \"6366100d-f68c-43ce-879b-4cc3f80c8156\" (UID: \"6366100d-f68c-43ce-879b-4cc3f80c8156\") " Feb 26 11:48:05 crc kubenswrapper[4699]: I0226 11:48:05.815504 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv" (OuterVolumeSpecName: "kube-api-access-wgwkv") pod "6366100d-f68c-43ce-879b-4cc3f80c8156" (UID: "6366100d-f68c-43ce-879b-4cc3f80c8156"). InnerVolumeSpecName "kube-api-access-wgwkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:48:05 crc kubenswrapper[4699]: I0226 11:48:05.910736 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgwkv\" (UniqueName: \"kubernetes.io/projected/6366100d-f68c-43ce-879b-4cc3f80c8156-kube-api-access-wgwkv\") on node \"crc\" DevicePath \"\"" Feb 26 11:48:06 crc kubenswrapper[4699]: I0226 11:48:06.319061 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535108-79cdj" event={"ID":"6366100d-f68c-43ce-879b-4cc3f80c8156","Type":"ContainerDied","Data":"9550e0566c3abf9a4ff53e5eebe42cb9ed71dc39b77b450749cfbf15b78168d7"} Feb 26 11:48:06 crc kubenswrapper[4699]: I0226 11:48:06.319452 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9550e0566c3abf9a4ff53e5eebe42cb9ed71dc39b77b450749cfbf15b78168d7" Feb 26 11:48:06 crc kubenswrapper[4699]: I0226 11:48:06.319143 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535108-79cdj" Feb 26 11:48:06 crc kubenswrapper[4699]: I0226 11:48:06.753922 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535102-2zbvr"] Feb 26 11:48:06 crc kubenswrapper[4699]: I0226 11:48:06.762852 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535102-2zbvr"] Feb 26 11:48:08 crc kubenswrapper[4699]: I0226 11:48:08.272475 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984" path="/var/lib/kubelet/pods/1b7b6e2c-94b0-4cfd-9b33-f988c4a8c984/volumes" Feb 26 11:48:11 crc kubenswrapper[4699]: I0226 11:48:11.584929 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:48:11 crc kubenswrapper[4699]: I0226 11:48:11.585657 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:48:31 crc kubenswrapper[4699]: I0226 11:48:31.385803 4699 scope.go:117] "RemoveContainer" containerID="dfc62ad99cdddeccaa0a04e48b0be130dad6cc30569fc90d45e5fa7beabda285" Feb 26 11:48:41 crc kubenswrapper[4699]: I0226 11:48:41.594007 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:48:41 crc kubenswrapper[4699]: 
I0226 11:48:41.595520 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.585527 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.586235 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.586304 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.587459 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.587545 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" 
containerName="machine-config-daemon" containerID="cri-o://c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" gracePeriod=600 Feb 26 11:49:11 crc kubenswrapper[4699]: E0226 11:49:11.716613 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.969927 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" exitCode=0 Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.969982 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde"} Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.970384 4699 scope.go:117] "RemoveContainer" containerID="6ef034a72d27c84dbd807adb1a50ce258b1b8022f1d940a8fb612e62f1d33345" Feb 26 11:49:11 crc kubenswrapper[4699]: I0226 11:49:11.972001 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:49:11 crc kubenswrapper[4699]: E0226 11:49:11.973309 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:49:25 crc kubenswrapper[4699]: I0226 11:49:25.261859 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:49:25 crc kubenswrapper[4699]: E0226 11:49:25.262870 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:49:38 crc kubenswrapper[4699]: I0226 11:49:38.261038 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:49:38 crc kubenswrapper[4699]: E0226 11:49:38.261945 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:49:51 crc kubenswrapper[4699]: I0226 11:49:51.261247 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:49:51 crc kubenswrapper[4699]: E0226 11:49:51.262082 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.151875 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535110-g2n8d"] Feb 26 11:50:00 crc kubenswrapper[4699]: E0226 11:50:00.153230 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6366100d-f68c-43ce-879b-4cc3f80c8156" containerName="oc" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.153247 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6366100d-f68c-43ce-879b-4cc3f80c8156" containerName="oc" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.153530 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6366100d-f68c-43ce-879b-4cc3f80c8156" containerName="oc" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.154384 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.157382 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.157475 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.157524 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.175281 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535110-g2n8d"] Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.258224 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtf98\" (UniqueName: \"kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98\") pod \"auto-csr-approver-29535110-g2n8d\" (UID: \"e270a2c1-b1c4-498d-9adf-a3cbb51defce\") " pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.360259 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtf98\" (UniqueName: \"kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98\") pod \"auto-csr-approver-29535110-g2n8d\" (UID: \"e270a2c1-b1c4-498d-9adf-a3cbb51defce\") " pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.381853 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtf98\" (UniqueName: \"kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98\") pod \"auto-csr-approver-29535110-g2n8d\" (UID: \"e270a2c1-b1c4-498d-9adf-a3cbb51defce\") " 
pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.479556 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:00 crc kubenswrapper[4699]: I0226 11:50:00.963154 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535110-g2n8d"] Feb 26 11:50:01 crc kubenswrapper[4699]: I0226 11:50:01.419944 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" event={"ID":"e270a2c1-b1c4-498d-9adf-a3cbb51defce","Type":"ContainerStarted","Data":"c9a043ff431ac00b2450dceb37f7050fd1f84e1b7f33e8aa986d6da1ce100586"} Feb 26 11:50:03 crc kubenswrapper[4699]: I0226 11:50:03.261589 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:50:03 crc kubenswrapper[4699]: E0226 11:50:03.262547 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:50:04 crc kubenswrapper[4699]: I0226 11:50:04.447535 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" event={"ID":"e270a2c1-b1c4-498d-9adf-a3cbb51defce","Type":"ContainerStarted","Data":"5c3d5a2f0c08caa11b3efe5f7dadcab2f42f5d3eecfcc331eaac28aadfec2f57"} Feb 26 11:50:04 crc kubenswrapper[4699]: I0226 11:50:04.467130 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" podStartSLOduration=1.3769225600000001 
podStartE2EDuration="4.46709568s" podCreationTimestamp="2026-02-26 11:50:00 +0000 UTC" firstStartedPulling="2026-02-26 11:50:00.966701521 +0000 UTC m=+2346.777527955" lastFinishedPulling="2026-02-26 11:50:04.056874641 +0000 UTC m=+2349.867701075" observedRunningTime="2026-02-26 11:50:04.459552103 +0000 UTC m=+2350.270378527" watchObservedRunningTime="2026-02-26 11:50:04.46709568 +0000 UTC m=+2350.277922124" Feb 26 11:50:05 crc kubenswrapper[4699]: I0226 11:50:05.458276 4699 generic.go:334] "Generic (PLEG): container finished" podID="e270a2c1-b1c4-498d-9adf-a3cbb51defce" containerID="5c3d5a2f0c08caa11b3efe5f7dadcab2f42f5d3eecfcc331eaac28aadfec2f57" exitCode=0 Feb 26 11:50:05 crc kubenswrapper[4699]: I0226 11:50:05.458328 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" event={"ID":"e270a2c1-b1c4-498d-9adf-a3cbb51defce","Type":"ContainerDied","Data":"5c3d5a2f0c08caa11b3efe5f7dadcab2f42f5d3eecfcc331eaac28aadfec2f57"} Feb 26 11:50:06 crc kubenswrapper[4699]: I0226 11:50:06.783481 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:06 crc kubenswrapper[4699]: I0226 11:50:06.797896 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtf98\" (UniqueName: \"kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98\") pod \"e270a2c1-b1c4-498d-9adf-a3cbb51defce\" (UID: \"e270a2c1-b1c4-498d-9adf-a3cbb51defce\") " Feb 26 11:50:06 crc kubenswrapper[4699]: I0226 11:50:06.805211 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98" (OuterVolumeSpecName: "kube-api-access-gtf98") pod "e270a2c1-b1c4-498d-9adf-a3cbb51defce" (UID: "e270a2c1-b1c4-498d-9adf-a3cbb51defce"). InnerVolumeSpecName "kube-api-access-gtf98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:50:06 crc kubenswrapper[4699]: I0226 11:50:06.899633 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtf98\" (UniqueName: \"kubernetes.io/projected/e270a2c1-b1c4-498d-9adf-a3cbb51defce-kube-api-access-gtf98\") on node \"crc\" DevicePath \"\"" Feb 26 11:50:07 crc kubenswrapper[4699]: I0226 11:50:07.476848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" event={"ID":"e270a2c1-b1c4-498d-9adf-a3cbb51defce","Type":"ContainerDied","Data":"c9a043ff431ac00b2450dceb37f7050fd1f84e1b7f33e8aa986d6da1ce100586"} Feb 26 11:50:07 crc kubenswrapper[4699]: I0226 11:50:07.477200 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9a043ff431ac00b2450dceb37f7050fd1f84e1b7f33e8aa986d6da1ce100586" Feb 26 11:50:07 crc kubenswrapper[4699]: I0226 11:50:07.476924 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535110-g2n8d" Feb 26 11:50:07 crc kubenswrapper[4699]: I0226 11:50:07.534022 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535104-r58dw"] Feb 26 11:50:07 crc kubenswrapper[4699]: I0226 11:50:07.544914 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535104-r58dw"] Feb 26 11:50:08 crc kubenswrapper[4699]: I0226 11:50:08.271658 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a59d7ac-e643-4693-9c6b-994f1fadd83d" path="/var/lib/kubelet/pods/3a59d7ac-e643-4693-9c6b-994f1fadd83d/volumes" Feb 26 11:50:18 crc kubenswrapper[4699]: I0226 11:50:18.262091 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:50:18 crc kubenswrapper[4699]: E0226 11:50:18.265626 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:50:29 crc kubenswrapper[4699]: I0226 11:50:29.261572 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:50:29 crc kubenswrapper[4699]: E0226 11:50:29.264431 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:50:31 crc kubenswrapper[4699]: I0226 11:50:31.508105 4699 scope.go:117] "RemoveContainer" containerID="6dd92189791b2617628aa3e717314eb02f69fda3f8d5e7e8ceb2bcddb537435f" Feb 26 11:50:40 crc kubenswrapper[4699]: I0226 11:50:40.261207 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:50:40 crc kubenswrapper[4699]: E0226 11:50:40.262046 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:50:52 crc kubenswrapper[4699]: I0226 11:50:52.261034 4699 scope.go:117] "RemoveContainer" 
containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:50:52 crc kubenswrapper[4699]: E0226 11:50:52.262054 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:51:07 crc kubenswrapper[4699]: I0226 11:51:07.261240 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:51:07 crc kubenswrapper[4699]: E0226 11:51:07.262175 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:51:21 crc kubenswrapper[4699]: I0226 11:51:21.261269 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:51:21 crc kubenswrapper[4699]: E0226 11:51:21.262142 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:51:32 crc kubenswrapper[4699]: I0226 11:51:32.261901 4699 scope.go:117] 
"RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:51:32 crc kubenswrapper[4699]: E0226 11:51:32.263073 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:51:46 crc kubenswrapper[4699]: I0226 11:51:46.266055 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:51:46 crc kubenswrapper[4699]: E0226 11:51:46.266986 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:51:57 crc kubenswrapper[4699]: I0226 11:51:57.260946 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:51:57 crc kubenswrapper[4699]: E0226 11:51:57.262004 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.161919 
4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535112-jg5nd"] Feb 26 11:52:00 crc kubenswrapper[4699]: E0226 11:52:00.163267 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e270a2c1-b1c4-498d-9adf-a3cbb51defce" containerName="oc" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.163294 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e270a2c1-b1c4-498d-9adf-a3cbb51defce" containerName="oc" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.163597 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e270a2c1-b1c4-498d-9adf-a3cbb51defce" containerName="oc" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.164310 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.169451 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.171607 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.171867 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.188105 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535112-jg5nd"] Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.298342 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qvmb\" (UniqueName: \"kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb\") pod \"auto-csr-approver-29535112-jg5nd\" (UID: \"5ae8ab95-85cc-473a-bbe7-6065a75e5720\") " 
pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.553010 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qvmb\" (UniqueName: \"kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb\") pod \"auto-csr-approver-29535112-jg5nd\" (UID: \"5ae8ab95-85cc-473a-bbe7-6065a75e5720\") " pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.582731 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qvmb\" (UniqueName: \"kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb\") pod \"auto-csr-approver-29535112-jg5nd\" (UID: \"5ae8ab95-85cc-473a-bbe7-6065a75e5720\") " pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:00 crc kubenswrapper[4699]: I0226 11:52:00.784422 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:01 crc kubenswrapper[4699]: I0226 11:52:01.281516 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535112-jg5nd"] Feb 26 11:52:01 crc kubenswrapper[4699]: I0226 11:52:01.691281 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" event={"ID":"5ae8ab95-85cc-473a-bbe7-6065a75e5720","Type":"ContainerStarted","Data":"df32e428e87e50ae5b5d52fe0abccaf14b1261cbb9be90b6f95e6b5819312938"} Feb 26 11:52:03 crc kubenswrapper[4699]: I0226 11:52:03.763800 4699 generic.go:334] "Generic (PLEG): container finished" podID="5ae8ab95-85cc-473a-bbe7-6065a75e5720" containerID="e246a9fcedf1306ea4a405c16944f8ad4f9cf630b0ec81a4cd3160f4b051a918" exitCode=0 Feb 26 11:52:03 crc kubenswrapper[4699]: I0226 11:52:03.763891 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535112-jg5nd" event={"ID":"5ae8ab95-85cc-473a-bbe7-6065a75e5720","Type":"ContainerDied","Data":"e246a9fcedf1306ea4a405c16944f8ad4f9cf630b0ec81a4cd3160f4b051a918"} Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.108190 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.226213 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qvmb\" (UniqueName: \"kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb\") pod \"5ae8ab95-85cc-473a-bbe7-6065a75e5720\" (UID: \"5ae8ab95-85cc-473a-bbe7-6065a75e5720\") " Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.234547 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb" (OuterVolumeSpecName: "kube-api-access-7qvmb") pod "5ae8ab95-85cc-473a-bbe7-6065a75e5720" (UID: "5ae8ab95-85cc-473a-bbe7-6065a75e5720"). InnerVolumeSpecName "kube-api-access-7qvmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.408052 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qvmb\" (UniqueName: \"kubernetes.io/projected/5ae8ab95-85cc-473a-bbe7-6065a75e5720-kube-api-access-7qvmb\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.782141 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" event={"ID":"5ae8ab95-85cc-473a-bbe7-6065a75e5720","Type":"ContainerDied","Data":"df32e428e87e50ae5b5d52fe0abccaf14b1261cbb9be90b6f95e6b5819312938"} Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.782189 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df32e428e87e50ae5b5d52fe0abccaf14b1261cbb9be90b6f95e6b5819312938" Feb 26 11:52:05 crc kubenswrapper[4699]: I0226 11:52:05.782189 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535112-jg5nd" Feb 26 11:52:06 crc kubenswrapper[4699]: I0226 11:52:06.180913 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535106-cv2s5"] Feb 26 11:52:06 crc kubenswrapper[4699]: I0226 11:52:06.188445 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535106-cv2s5"] Feb 26 11:52:06 crc kubenswrapper[4699]: I0226 11:52:06.272064 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277ed376-d775-489c-82e7-93962bd513ff" path="/var/lib/kubelet/pods/277ed376-d775-489c-82e7-93962bd513ff/volumes" Feb 26 11:52:09 crc kubenswrapper[4699]: I0226 11:52:09.261418 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:52:09 crc kubenswrapper[4699]: E0226 11:52:09.261951 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:52:21 crc kubenswrapper[4699]: I0226 11:52:21.012861 4699 generic.go:334] "Generic (PLEG): container finished" podID="6436c321-6850-4db3-81b2-0dc329e10900" containerID="8bd3df01daa0942902ceb3f721b2d365aa21e62ede502d0a6f006ad1267cec53" exitCode=0 Feb 26 11:52:21 crc kubenswrapper[4699]: I0226 11:52:21.012967 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" event={"ID":"6436c321-6850-4db3-81b2-0dc329e10900","Type":"ContainerDied","Data":"8bd3df01daa0942902ceb3f721b2d365aa21e62ede502d0a6f006ad1267cec53"} Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.466022 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.659066 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57ppn\" (UniqueName: \"kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn\") pod \"6436c321-6850-4db3-81b2-0dc329e10900\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.659201 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam\") pod \"6436c321-6850-4db3-81b2-0dc329e10900\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.659423 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0\") pod \"6436c321-6850-4db3-81b2-0dc329e10900\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.659570 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle\") pod \"6436c321-6850-4db3-81b2-0dc329e10900\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.659700 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory\") pod \"6436c321-6850-4db3-81b2-0dc329e10900\" (UID: \"6436c321-6850-4db3-81b2-0dc329e10900\") " Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.665723 4699 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn" (OuterVolumeSpecName: "kube-api-access-57ppn") pod "6436c321-6850-4db3-81b2-0dc329e10900" (UID: "6436c321-6850-4db3-81b2-0dc329e10900"). InnerVolumeSpecName "kube-api-access-57ppn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.666321 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6436c321-6850-4db3-81b2-0dc329e10900" (UID: "6436c321-6850-4db3-81b2-0dc329e10900"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.691976 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6436c321-6850-4db3-81b2-0dc329e10900" (UID: "6436c321-6850-4db3-81b2-0dc329e10900"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.695334 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory" (OuterVolumeSpecName: "inventory") pod "6436c321-6850-4db3-81b2-0dc329e10900" (UID: "6436c321-6850-4db3-81b2-0dc329e10900"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.698946 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6436c321-6850-4db3-81b2-0dc329e10900" (UID: "6436c321-6850-4db3-81b2-0dc329e10900"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.762514 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.762717 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57ppn\" (UniqueName: \"kubernetes.io/projected/6436c321-6850-4db3-81b2-0dc329e10900-kube-api-access-57ppn\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.762793 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.762857 4699 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:22 crc kubenswrapper[4699]: I0226 11:52:22.762921 4699 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6436c321-6850-4db3-81b2-0dc329e10900-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.033778 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" event={"ID":"6436c321-6850-4db3-81b2-0dc329e10900","Type":"ContainerDied","Data":"da7eabc20b73f3cfcb5f479d6c26b5a779dbd02d4697b1b42ef3653df7b2ae5b"} Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.033932 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.034110 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7eabc20b73f3cfcb5f479d6c26b5a779dbd02d4697b1b42ef3653df7b2ae5b" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.149482 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666"] Feb 26 11:52:23 crc kubenswrapper[4699]: E0226 11:52:23.150721 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6436c321-6850-4db3-81b2-0dc329e10900" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.150753 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6436c321-6850-4db3-81b2-0dc329e10900" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 11:52:23 crc kubenswrapper[4699]: E0226 11:52:23.150795 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae8ab95-85cc-473a-bbe7-6065a75e5720" containerName="oc" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.150804 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae8ab95-85cc-473a-bbe7-6065a75e5720" containerName="oc" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.151079 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae8ab95-85cc-473a-bbe7-6065a75e5720" containerName="oc" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.151147 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6436c321-6850-4db3-81b2-0dc329e10900" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.152065 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.155040 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.157643 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.166206 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.261494 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.261580 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.261659 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.261893 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.304479 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666"] Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313480 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313582 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313752 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xnjl\" (UniqueName: \"kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313858 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313899 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: 
\"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.313993 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.314074 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.314132 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc 
kubenswrapper[4699]: I0226 11:52:23.314236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.314322 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416357 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416429 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416510 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xnjl\" (UniqueName: 
\"kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416562 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416622 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416663 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416710 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416739 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416794 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.416842 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.419103 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.422601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.422634 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.422680 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.423046 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.423566 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.424018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.425726 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.428710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.428982 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.436604 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xnjl\" (UniqueName: \"kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wv666\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:23 crc kubenswrapper[4699]: I0226 11:52:23.608397 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:52:24 crc kubenswrapper[4699]: I0226 11:52:24.160209 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666"] Feb 26 11:52:24 crc kubenswrapper[4699]: I0226 11:52:24.163236 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:52:24 crc kubenswrapper[4699]: I0226 11:52:24.261289 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:52:24 crc kubenswrapper[4699]: E0226 11:52:24.261603 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:52:25 crc kubenswrapper[4699]: I0226 11:52:25.052467 4699 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" event={"ID":"2c2e8329-038c-4347-b30f-f8b42f36cc67","Type":"ContainerStarted","Data":"c9492e99db4aabcb6a5c3c841ccddee9f07e9207f1e35227acfbe163ff34fec2"} Feb 26 11:52:26 crc kubenswrapper[4699]: I0226 11:52:26.061188 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" event={"ID":"2c2e8329-038c-4347-b30f-f8b42f36cc67","Type":"ContainerStarted","Data":"9e14db39e9ca42779c03c0f56859f1620acf93ef1802275caf2edd19f2d27624"} Feb 26 11:52:26 crc kubenswrapper[4699]: I0226 11:52:26.082047 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" podStartSLOduration=2.165358527 podStartE2EDuration="3.082016028s" podCreationTimestamp="2026-02-26 11:52:23 +0000 UTC" firstStartedPulling="2026-02-26 11:52:24.162881586 +0000 UTC m=+2489.973708020" lastFinishedPulling="2026-02-26 11:52:25.079539087 +0000 UTC m=+2490.890365521" observedRunningTime="2026-02-26 11:52:26.079461838 +0000 UTC m=+2491.890288272" watchObservedRunningTime="2026-02-26 11:52:26.082016028 +0000 UTC m=+2491.892842462" Feb 26 11:52:31 crc kubenswrapper[4699]: I0226 11:52:31.620830 4699 scope.go:117] "RemoveContainer" containerID="6316bd489dab2ee525da2e5168f12e3d42a5b7c5139e77da702337350ea3b44a" Feb 26 11:52:38 crc kubenswrapper[4699]: I0226 11:52:38.261331 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:52:38 crc kubenswrapper[4699]: E0226 11:52:38.262433 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:52:50 crc kubenswrapper[4699]: I0226 11:52:50.261454 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:52:50 crc kubenswrapper[4699]: E0226 11:52:50.262396 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:53:05 crc kubenswrapper[4699]: I0226 11:53:05.261670 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:53:05 crc kubenswrapper[4699]: E0226 11:53:05.262468 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.785518 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.787934 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.798922 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wpm\" (UniqueName: \"kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.798989 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.799014 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.806502 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.902177 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wpm\" (UniqueName: \"kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.902290 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.902334 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.903261 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.903286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:08 crc kubenswrapper[4699]: I0226 11:53:08.924265 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wpm\" (UniqueName: \"kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm\") pod \"community-operators-l7242\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:09 crc kubenswrapper[4699]: I0226 11:53:09.110524 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:09 crc kubenswrapper[4699]: I0226 11:53:09.564008 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:10 crc kubenswrapper[4699]: I0226 11:53:10.427026 4699 generic.go:334] "Generic (PLEG): container finished" podID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerID="9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577" exitCode=0 Feb 26 11:53:10 crc kubenswrapper[4699]: I0226 11:53:10.427093 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerDied","Data":"9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577"} Feb 26 11:53:10 crc kubenswrapper[4699]: I0226 11:53:10.427339 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerStarted","Data":"587c06830f08110563d11bd959f9ca1b7f81ea2321b8b0fba829c5801e29fcb7"} Feb 26 11:53:12 crc kubenswrapper[4699]: I0226 11:53:12.443326 4699 generic.go:334] "Generic (PLEG): container finished" podID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerID="f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0" exitCode=0 Feb 26 11:53:12 crc kubenswrapper[4699]: I0226 11:53:12.443432 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerDied","Data":"f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0"} Feb 26 11:53:13 crc kubenswrapper[4699]: I0226 11:53:13.456908 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" 
event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerStarted","Data":"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1"} Feb 26 11:53:13 crc kubenswrapper[4699]: I0226 11:53:13.487211 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7242" podStartSLOduration=3.051567778 podStartE2EDuration="5.487188083s" podCreationTimestamp="2026-02-26 11:53:08 +0000 UTC" firstStartedPulling="2026-02-26 11:53:10.429267525 +0000 UTC m=+2536.240093959" lastFinishedPulling="2026-02-26 11:53:12.86488782 +0000 UTC m=+2538.675714264" observedRunningTime="2026-02-26 11:53:13.476858737 +0000 UTC m=+2539.287685221" watchObservedRunningTime="2026-02-26 11:53:13.487188083 +0000 UTC m=+2539.298014517" Feb 26 11:53:17 crc kubenswrapper[4699]: I0226 11:53:17.260689 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:53:17 crc kubenswrapper[4699]: E0226 11:53:17.261901 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:53:19 crc kubenswrapper[4699]: I0226 11:53:19.111061 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:19 crc kubenswrapper[4699]: I0226 11:53:19.111582 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:19 crc kubenswrapper[4699]: I0226 11:53:19.155944 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:19 crc kubenswrapper[4699]: I0226 11:53:19.574617 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:19 crc kubenswrapper[4699]: I0226 11:53:19.620658 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:21 crc kubenswrapper[4699]: I0226 11:53:21.542967 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7242" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="registry-server" containerID="cri-o://bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1" gracePeriod=2 Feb 26 11:53:21 crc kubenswrapper[4699]: I0226 11:53:21.944347 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.080004 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities\") pod \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.080166 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wpm\" (UniqueName: \"kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm\") pod \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.080196 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content\") pod 
\"8bffc9ae-b2b5-473a-8876-958983e1b5cc\" (UID: \"8bffc9ae-b2b5-473a-8876-958983e1b5cc\") " Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.081045 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities" (OuterVolumeSpecName: "utilities") pod "8bffc9ae-b2b5-473a-8876-958983e1b5cc" (UID: "8bffc9ae-b2b5-473a-8876-958983e1b5cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.086406 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm" (OuterVolumeSpecName: "kube-api-access-w2wpm") pod "8bffc9ae-b2b5-473a-8876-958983e1b5cc" (UID: "8bffc9ae-b2b5-473a-8876-958983e1b5cc"). InnerVolumeSpecName "kube-api-access-w2wpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.143512 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bffc9ae-b2b5-473a-8876-958983e1b5cc" (UID: "8bffc9ae-b2b5-473a-8876-958983e1b5cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.182867 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wpm\" (UniqueName: \"kubernetes.io/projected/8bffc9ae-b2b5-473a-8876-958983e1b5cc-kube-api-access-w2wpm\") on node \"crc\" DevicePath \"\"" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.182910 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.182920 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bffc9ae-b2b5-473a-8876-958983e1b5cc-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.553681 4699 generic.go:334] "Generic (PLEG): container finished" podID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerID="bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1" exitCode=0 Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.553741 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerDied","Data":"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1"} Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.553786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7242" event={"ID":"8bffc9ae-b2b5-473a-8876-958983e1b5cc","Type":"ContainerDied","Data":"587c06830f08110563d11bd959f9ca1b7f81ea2321b8b0fba829c5801e29fcb7"} Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.553809 4699 scope.go:117] "RemoveContainer" containerID="bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 
11:53:22.554891 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7242" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.581598 4699 scope.go:117] "RemoveContainer" containerID="f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.604247 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.613261 4699 scope.go:117] "RemoveContainer" containerID="9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.616255 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7242"] Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.648261 4699 scope.go:117] "RemoveContainer" containerID="bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1" Feb 26 11:53:22 crc kubenswrapper[4699]: E0226 11:53:22.648787 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1\": container with ID starting with bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1 not found: ID does not exist" containerID="bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.648835 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1"} err="failed to get container status \"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1\": rpc error: code = NotFound desc = could not find container \"bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1\": container with ID starting with 
bc50d20d4888831a6bedb5c285335eef8a68e1d761755841550d21819f1014e1 not found: ID does not exist" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.648866 4699 scope.go:117] "RemoveContainer" containerID="f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0" Feb 26 11:53:22 crc kubenswrapper[4699]: E0226 11:53:22.649256 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0\": container with ID starting with f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0 not found: ID does not exist" containerID="f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.649294 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0"} err="failed to get container status \"f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0\": rpc error: code = NotFound desc = could not find container \"f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0\": container with ID starting with f7690aff2701e37432ac7b6d5a7d18421e13b42c997817acc6658e7e5f9b41e0 not found: ID does not exist" Feb 26 11:53:22 crc kubenswrapper[4699]: I0226 11:53:22.649321 4699 scope.go:117] "RemoveContainer" containerID="9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577" Feb 26 11:53:22 crc kubenswrapper[4699]: E0226 11:53:22.649582 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577\": container with ID starting with 9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577 not found: ID does not exist" containerID="9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577" Feb 26 11:53:22 crc 
kubenswrapper[4699]: I0226 11:53:22.649610 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577"} err="failed to get container status \"9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577\": rpc error: code = NotFound desc = could not find container \"9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577\": container with ID starting with 9e7a7cd6623aa96bb6b9ff153e54ceb4b7943dc8d7122fd11cbfc89df20ba577 not found: ID does not exist" Feb 26 11:53:24 crc kubenswrapper[4699]: I0226 11:53:24.273750 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" path="/var/lib/kubelet/pods/8bffc9ae-b2b5-473a-8876-958983e1b5cc/volumes" Feb 26 11:53:31 crc kubenswrapper[4699]: I0226 11:53:31.262447 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:53:31 crc kubenswrapper[4699]: E0226 11:53:31.263648 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:53:44 crc kubenswrapper[4699]: I0226 11:53:44.261057 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:53:44 crc kubenswrapper[4699]: E0226 11:53:44.262448 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:53:58 crc kubenswrapper[4699]: I0226 11:53:58.261687 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:53:58 crc kubenswrapper[4699]: E0226 11:53:58.262609 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.152356 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535114-zp6df"] Feb 26 11:54:00 crc kubenswrapper[4699]: E0226 11:54:00.153242 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="extract-utilities" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.153258 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="extract-utilities" Feb 26 11:54:00 crc kubenswrapper[4699]: E0226 11:54:00.153285 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="extract-content" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.153292 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="extract-content" Feb 26 11:54:00 crc kubenswrapper[4699]: E0226 11:54:00.153307 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="registry-server" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.153314 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="registry-server" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.153585 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bffc9ae-b2b5-473a-8876-958983e1b5cc" containerName="registry-server" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.154275 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.161546 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535114-zp6df"] Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.161566 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.161851 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.161917 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.251910 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntw7\" (UniqueName: \"kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7\") pod \"auto-csr-approver-29535114-zp6df\" (UID: \"c8acba14-233d-44a8-98b6-93df64a45300\") " pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.354080 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntw7\" (UniqueName: 
\"kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7\") pod \"auto-csr-approver-29535114-zp6df\" (UID: \"c8acba14-233d-44a8-98b6-93df64a45300\") " pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.382916 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntw7\" (UniqueName: \"kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7\") pod \"auto-csr-approver-29535114-zp6df\" (UID: \"c8acba14-233d-44a8-98b6-93df64a45300\") " pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.490433 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:00 crc kubenswrapper[4699]: I0226 11:54:00.958694 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535114-zp6df"] Feb 26 11:54:01 crc kubenswrapper[4699]: I0226 11:54:01.896419 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535114-zp6df" event={"ID":"c8acba14-233d-44a8-98b6-93df64a45300","Type":"ContainerStarted","Data":"c34504929420ae9c1342ec0cb9206d4652cb123907ec8d9973ed44375f7fd77d"} Feb 26 11:54:02 crc kubenswrapper[4699]: I0226 11:54:02.907085 4699 generic.go:334] "Generic (PLEG): container finished" podID="c8acba14-233d-44a8-98b6-93df64a45300" containerID="ffa425939368131f51ca5df0c799cff39019457552b4886c8f2b5719e7868319" exitCode=0 Feb 26 11:54:02 crc kubenswrapper[4699]: I0226 11:54:02.907155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535114-zp6df" event={"ID":"c8acba14-233d-44a8-98b6-93df64a45300","Type":"ContainerDied","Data":"ffa425939368131f51ca5df0c799cff39019457552b4886c8f2b5719e7868319"} Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.245689 4699 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.338491 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jntw7\" (UniqueName: \"kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7\") pod \"c8acba14-233d-44a8-98b6-93df64a45300\" (UID: \"c8acba14-233d-44a8-98b6-93df64a45300\") " Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.345350 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7" (OuterVolumeSpecName: "kube-api-access-jntw7") pod "c8acba14-233d-44a8-98b6-93df64a45300" (UID: "c8acba14-233d-44a8-98b6-93df64a45300"). InnerVolumeSpecName "kube-api-access-jntw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.442276 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jntw7\" (UniqueName: \"kubernetes.io/projected/c8acba14-233d-44a8-98b6-93df64a45300-kube-api-access-jntw7\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.926760 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535114-zp6df" event={"ID":"c8acba14-233d-44a8-98b6-93df64a45300","Type":"ContainerDied","Data":"c34504929420ae9c1342ec0cb9206d4652cb123907ec8d9973ed44375f7fd77d"} Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.926799 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34504929420ae9c1342ec0cb9206d4652cb123907ec8d9973ed44375f7fd77d" Feb 26 11:54:04 crc kubenswrapper[4699]: I0226 11:54:04.926853 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535114-zp6df" Feb 26 11:54:05 crc kubenswrapper[4699]: I0226 11:54:05.313105 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535108-79cdj"] Feb 26 11:54:05 crc kubenswrapper[4699]: I0226 11:54:05.320636 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535108-79cdj"] Feb 26 11:54:06 crc kubenswrapper[4699]: I0226 11:54:06.272034 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6366100d-f68c-43ce-879b-4cc3f80c8156" path="/var/lib/kubelet/pods/6366100d-f68c-43ce-879b-4cc3f80c8156/volumes" Feb 26 11:54:09 crc kubenswrapper[4699]: I0226 11:54:09.261705 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:54:09 crc kubenswrapper[4699]: E0226 11:54:09.262395 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 11:54:23 crc kubenswrapper[4699]: I0226 11:54:23.261462 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:54:24 crc kubenswrapper[4699]: I0226 11:54:24.091955 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f"} Feb 26 11:54:31 crc kubenswrapper[4699]: I0226 11:54:31.754605 4699 scope.go:117] "RemoveContainer" 
containerID="a49e0f6a5b8aa98c17ff2dc316f41da6f3d780c3f18aaef30599837dcc6bc0ea" Feb 26 11:54:42 crc kubenswrapper[4699]: I0226 11:54:42.891171 4699 generic.go:334] "Generic (PLEG): container finished" podID="2c2e8329-038c-4347-b30f-f8b42f36cc67" containerID="9e14db39e9ca42779c03c0f56859f1620acf93ef1802275caf2edd19f2d27624" exitCode=0 Feb 26 11:54:42 crc kubenswrapper[4699]: I0226 11:54:42.891256 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" event={"ID":"2c2e8329-038c-4347-b30f-f8b42f36cc67","Type":"ContainerDied","Data":"9e14db39e9ca42779c03c0f56859f1620acf93ef1802275caf2edd19f2d27624"} Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.338788 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481468 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481569 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481606 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 
11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481636 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481704 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481758 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481800 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.481885 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.482031 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xnjl\" (UniqueName: \"kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl\") pod \"2c2e8329-038c-4347-b30f-f8b42f36cc67\" (UID: \"2c2e8329-038c-4347-b30f-f8b42f36cc67\") " Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.489490 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.490731 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl" (OuterVolumeSpecName: "kube-api-access-8xnjl") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "kube-api-access-8xnjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.514262 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.520242 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.520266 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.521469 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.526039 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.531066 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.531066 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory" (OuterVolumeSpecName: "inventory") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.532345 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.533890 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2c2e8329-038c-4347-b30f-f8b42f36cc67" (UID: "2c2e8329-038c-4347-b30f-f8b42f36cc67"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.587517 4699 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.587988 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588159 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588266 4699 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588373 4699 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 
11:54:44.588466 4699 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588548 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xnjl\" (UniqueName: \"kubernetes.io/projected/2c2e8329-038c-4347-b30f-f8b42f36cc67-kube-api-access-8xnjl\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588688 4699 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588829 4699 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.588920 4699 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.589134 4699 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c2e8329-038c-4347-b30f-f8b42f36cc67-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.910398 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" event={"ID":"2c2e8329-038c-4347-b30f-f8b42f36cc67","Type":"ContainerDied","Data":"c9492e99db4aabcb6a5c3c841ccddee9f07e9207f1e35227acfbe163ff34fec2"} Feb 26 
11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.910440 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9492e99db4aabcb6a5c3c841ccddee9f07e9207f1e35227acfbe163ff34fec2" Feb 26 11:54:44 crc kubenswrapper[4699]: I0226 11:54:44.910461 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wv666" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.106688 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9"] Feb 26 11:54:45 crc kubenswrapper[4699]: E0226 11:54:45.107328 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2e8329-038c-4347-b30f-f8b42f36cc67" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.107351 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2e8329-038c-4347-b30f-f8b42f36cc67" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 11:54:45 crc kubenswrapper[4699]: E0226 11:54:45.107390 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8acba14-233d-44a8-98b6-93df64a45300" containerName="oc" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.107400 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8acba14-233d-44a8-98b6-93df64a45300" containerName="oc" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.107651 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8acba14-233d-44a8-98b6-93df64a45300" containerName="oc" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.107681 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2e8329-038c-4347-b30f-f8b42f36cc67" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.108491 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.113940 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.114309 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.114833 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.115061 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.115286 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-69sdb" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.124458 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9"] Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.205066 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.205308 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.205739 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.205854 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjd7\" (UniqueName: \"kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.205931 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.206037 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.206102 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.308501 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309447 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjd7\" (UniqueName: \"kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309541 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309703 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309763 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309872 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.309988 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.314723 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.328205 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.328396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.328842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.328993 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.329559 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.332813 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjd7\" (UniqueName: \"kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.436003 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:54:45 crc kubenswrapper[4699]: I0226 11:54:45.981039 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9"] Feb 26 11:54:46 crc kubenswrapper[4699]: I0226 11:54:46.931533 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" event={"ID":"08bdd16a-fc18-4262-9175-a05b613a76c9","Type":"ContainerStarted","Data":"43d0f761beaf929bd7b88c678a07a81fe65b54f961758754c84f435ce7b8d8cb"} Feb 26 11:54:46 crc kubenswrapper[4699]: I0226 11:54:46.931941 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" event={"ID":"08bdd16a-fc18-4262-9175-a05b613a76c9","Type":"ContainerStarted","Data":"14466ace843fec23ce73560f034f012e5b9e5664261a1686721bb34a65e7ea16"} Feb 26 11:54:46 crc kubenswrapper[4699]: I0226 11:54:46.952585 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" podStartSLOduration=1.525734689 podStartE2EDuration="1.952564236s" podCreationTimestamp="2026-02-26 11:54:45 +0000 UTC" firstStartedPulling="2026-02-26 11:54:45.98275775 +0000 UTC m=+2631.793584184" lastFinishedPulling="2026-02-26 11:54:46.409587297 +0000 UTC m=+2632.220413731" observedRunningTime="2026-02-26 11:54:46.949571311 +0000 UTC m=+2632.760397765" watchObservedRunningTime="2026-02-26 11:54:46.952564236 +0000 UTC m=+2632.763390690" Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.810727 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ms696"] Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.813484 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.826316 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ms696"] Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.944153 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.945148 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwtbf\" (UniqueName: \"kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:35 crc kubenswrapper[4699]: I0226 11:55:35.945315 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.047132 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.047317 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dwtbf\" (UniqueName: \"kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.047358 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.047925 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.047925 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.069846 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwtbf\" (UniqueName: \"kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf\") pod \"certified-operators-ms696\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") " pod="openshift-marketplace/certified-operators-ms696" Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.143243 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:36 crc kubenswrapper[4699]: I0226 11:55:36.648505 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ms696"]
Feb 26 11:55:37 crc kubenswrapper[4699]: I0226 11:55:37.394253 4699 generic.go:334] "Generic (PLEG): container finished" podID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerID="57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e" exitCode=0
Feb 26 11:55:37 crc kubenswrapper[4699]: I0226 11:55:37.394321 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerDied","Data":"57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e"}
Feb 26 11:55:37 crc kubenswrapper[4699]: I0226 11:55:37.394697 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerStarted","Data":"f8058ebb930564641cfd3d71c132ace7ecff5864e9f26174f1874bbeeb27a955"}
Feb 26 11:55:39 crc kubenswrapper[4699]: I0226 11:55:39.415915 4699 generic.go:334] "Generic (PLEG): container finished" podID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerID="46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401" exitCode=0
Feb 26 11:55:39 crc kubenswrapper[4699]: I0226 11:55:39.416012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerDied","Data":"46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401"}
Feb 26 11:55:40 crc kubenswrapper[4699]: I0226 11:55:40.431798 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerStarted","Data":"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"}
Feb 26 11:55:40 crc kubenswrapper[4699]: I0226 11:55:40.464470 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ms696" podStartSLOduration=2.739494285 podStartE2EDuration="5.464446673s" podCreationTimestamp="2026-02-26 11:55:35 +0000 UTC" firstStartedPulling="2026-02-26 11:55:37.396728032 +0000 UTC m=+2683.207554466" lastFinishedPulling="2026-02-26 11:55:40.12168042 +0000 UTC m=+2685.932506854" observedRunningTime="2026-02-26 11:55:40.456456237 +0000 UTC m=+2686.267282701" watchObservedRunningTime="2026-02-26 11:55:40.464446673 +0000 UTC m=+2686.275273107"
Feb 26 11:55:46 crc kubenswrapper[4699]: I0226 11:55:46.144315 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:46 crc kubenswrapper[4699]: I0226 11:55:46.144377 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:46 crc kubenswrapper[4699]: I0226 11:55:46.194360 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:46 crc kubenswrapper[4699]: I0226 11:55:46.527283 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:46 crc kubenswrapper[4699]: I0226 11:55:46.575867 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ms696"]
Feb 26 11:55:48 crc kubenswrapper[4699]: I0226 11:55:48.499969 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ms696" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="registry-server" containerID="cri-o://c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d" gracePeriod=2
Feb 26 11:55:48 crc kubenswrapper[4699]: I0226 11:55:48.958184 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.117695 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwtbf\" (UniqueName: \"kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf\") pod \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") "
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.118523 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content\") pod \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") "
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.118666 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities\") pod \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\" (UID: \"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee\") "
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.119681 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities" (OuterVolumeSpecName: "utilities") pod "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" (UID: "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.125526 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf" (OuterVolumeSpecName: "kube-api-access-dwtbf") pod "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" (UID: "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee"). InnerVolumeSpecName "kube-api-access-dwtbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.185690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" (UID: "24a9d670-7b0f-45dc-ae64-5a2ef0c623ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.221174 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwtbf\" (UniqueName: \"kubernetes.io/projected/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-kube-api-access-dwtbf\") on node \"crc\" DevicePath \"\""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.221213 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.221223 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.512396 4699 generic.go:334] "Generic (PLEG): container finished" podID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerID="c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d" exitCode=0
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.512444 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerDied","Data":"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"}
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.512539 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms696" event={"ID":"24a9d670-7b0f-45dc-ae64-5a2ef0c623ee","Type":"ContainerDied","Data":"f8058ebb930564641cfd3d71c132ace7ecff5864e9f26174f1874bbeeb27a955"}
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.512562 4699 scope.go:117] "RemoveContainer" containerID="c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.512457 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ms696"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.537573 4699 scope.go:117] "RemoveContainer" containerID="46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.571559 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ms696"]
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.571891 4699 scope.go:117] "RemoveContainer" containerID="57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.587856 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ms696"]
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.616856 4699 scope.go:117] "RemoveContainer" containerID="c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"
Feb 26 11:55:49 crc kubenswrapper[4699]: E0226 11:55:49.617593 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d\": container with ID starting with c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d not found: ID does not exist" containerID="c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.617644 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d"} err="failed to get container status \"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d\": rpc error: code = NotFound desc = could not find container \"c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d\": container with ID starting with c98a95633db82982137dc140fdf6eaae665566024d6262406fab215329fcef6d not found: ID does not exist"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.617679 4699 scope.go:117] "RemoveContainer" containerID="46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401"
Feb 26 11:55:49 crc kubenswrapper[4699]: E0226 11:55:49.618032 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401\": container with ID starting with 46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401 not found: ID does not exist" containerID="46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.618052 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401"} err="failed to get container status \"46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401\": rpc error: code = NotFound desc = could not find container \"46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401\": container with ID starting with 46aeb100bd5a41ee978076bb87423c17006a41bb58ca71c78f257f6951501401 not found: ID does not exist"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.618065 4699 scope.go:117] "RemoveContainer" containerID="57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e"
Feb 26 11:55:49 crc kubenswrapper[4699]: E0226 11:55:49.618579 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e\": container with ID starting with 57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e not found: ID does not exist" containerID="57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e"
Feb 26 11:55:49 crc kubenswrapper[4699]: I0226 11:55:49.618622 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e"} err="failed to get container status \"57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e\": rpc error: code = NotFound desc = could not find container \"57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e\": container with ID starting with 57058290b00f5d6b590d5a4c78114725d2a482c376a97f7adb8c6ad1e53c213e not found: ID does not exist"
Feb 26 11:55:50 crc kubenswrapper[4699]: I0226 11:55:50.270774 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" path="/var/lib/kubelet/pods/24a9d670-7b0f-45dc-ae64-5a2ef0c623ee/volumes"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.147980 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535116-fwrcj"]
Feb 26 11:56:00 crc kubenswrapper[4699]: E0226 11:56:00.149311 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="extract-utilities"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.149334 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="extract-utilities"
Feb 26 11:56:00 crc kubenswrapper[4699]: E0226 11:56:00.149365 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="registry-server"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.149377 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="registry-server"
Feb 26 11:56:00 crc kubenswrapper[4699]: E0226 11:56:00.149406 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="extract-content"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.149414 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="extract-content"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.149675 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a9d670-7b0f-45dc-ae64-5a2ef0c623ee" containerName="registry-server"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.150496 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.153464 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.153541 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.153651 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.161755 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535116-fwrcj"]
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.242192 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzl4g\" (UniqueName: \"kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g\") pod \"auto-csr-approver-29535116-fwrcj\" (UID: \"f67852f2-cfab-4e51-b986-30f2a582877d\") " pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.343814 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzl4g\" (UniqueName: \"kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g\") pod \"auto-csr-approver-29535116-fwrcj\" (UID: \"f67852f2-cfab-4e51-b986-30f2a582877d\") " pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.369367 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzl4g\" (UniqueName: \"kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g\") pod \"auto-csr-approver-29535116-fwrcj\" (UID: \"f67852f2-cfab-4e51-b986-30f2a582877d\") " pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:00 crc kubenswrapper[4699]: I0226 11:56:00.476489 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:01 crc kubenswrapper[4699]: I0226 11:56:01.493991 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535116-fwrcj"]
Feb 26 11:56:01 crc kubenswrapper[4699]: I0226 11:56:01.637322 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535116-fwrcj" event={"ID":"f67852f2-cfab-4e51-b986-30f2a582877d","Type":"ContainerStarted","Data":"60489b00888faac90b3b6d74d0dee71d9d5747773017ec11625f44cd730a2ce5"}
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.117346 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.120362 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.130454 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.285693 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.285845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pvl\" (UniqueName: \"kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.285945 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.387431 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.387958 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pvl\" (UniqueName: \"kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.388047 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.388628 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.389936 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.418144 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pvl\" (UniqueName: \"kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl\") pod \"redhat-marketplace-2fx9d\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") " pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.449414 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:02 crc kubenswrapper[4699]: I0226 11:56:02.970923 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:02 crc kubenswrapper[4699]: W0226 11:56:02.983394 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d70b9c_e164_4128_9ed5_36526cbc378a.slice/crio-7ef6d935dedc52e617caaf85457a1f25f23f860e59a7f5ec692c0136e824c8d6 WatchSource:0}: Error finding container 7ef6d935dedc52e617caaf85457a1f25f23f860e59a7f5ec692c0136e824c8d6: Status 404 returned error can't find the container with id 7ef6d935dedc52e617caaf85457a1f25f23f860e59a7f5ec692c0136e824c8d6
Feb 26 11:56:03 crc kubenswrapper[4699]: I0226 11:56:03.665049 4699 generic.go:334] "Generic (PLEG): container finished" podID="f67852f2-cfab-4e51-b986-30f2a582877d" containerID="97b5ef4eef61ea4aaf36ee8c050903fab28c7dee69a56263785c220e6a8c6292" exitCode=0
Feb 26 11:56:03 crc kubenswrapper[4699]: I0226 11:56:03.665193 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535116-fwrcj" event={"ID":"f67852f2-cfab-4e51-b986-30f2a582877d","Type":"ContainerDied","Data":"97b5ef4eef61ea4aaf36ee8c050903fab28c7dee69a56263785c220e6a8c6292"}
Feb 26 11:56:03 crc kubenswrapper[4699]: I0226 11:56:03.669147 4699 generic.go:334] "Generic (PLEG): container finished" podID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerID="322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b" exitCode=0
Feb 26 11:56:03 crc kubenswrapper[4699]: I0226 11:56:03.669210 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerDied","Data":"322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b"}
Feb 26 11:56:03 crc kubenswrapper[4699]: I0226 11:56:03.669266 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerStarted","Data":"7ef6d935dedc52e617caaf85457a1f25f23f860e59a7f5ec692c0136e824c8d6"}
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.021400 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.150653 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzl4g\" (UniqueName: \"kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g\") pod \"f67852f2-cfab-4e51-b986-30f2a582877d\" (UID: \"f67852f2-cfab-4e51-b986-30f2a582877d\") "
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.158713 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g" (OuterVolumeSpecName: "kube-api-access-kzl4g") pod "f67852f2-cfab-4e51-b986-30f2a582877d" (UID: "f67852f2-cfab-4e51-b986-30f2a582877d"). InnerVolumeSpecName "kube-api-access-kzl4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.253085 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzl4g\" (UniqueName: \"kubernetes.io/projected/f67852f2-cfab-4e51-b986-30f2a582877d-kube-api-access-kzl4g\") on node \"crc\" DevicePath \"\""
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.709744 4699 generic.go:334] "Generic (PLEG): container finished" podID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerID="b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944" exitCode=0
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.709868 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerDied","Data":"b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944"}
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.711771 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535116-fwrcj" event={"ID":"f67852f2-cfab-4e51-b986-30f2a582877d","Type":"ContainerDied","Data":"60489b00888faac90b3b6d74d0dee71d9d5747773017ec11625f44cd730a2ce5"}
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.711812 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60489b00888faac90b3b6d74d0dee71d9d5747773017ec11625f44cd730a2ce5"
Feb 26 11:56:05 crc kubenswrapper[4699]: I0226 11:56:05.711963 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535116-fwrcj"
Feb 26 11:56:06 crc kubenswrapper[4699]: I0226 11:56:06.106261 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535110-g2n8d"]
Feb 26 11:56:06 crc kubenswrapper[4699]: I0226 11:56:06.116373 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535110-g2n8d"]
Feb 26 11:56:06 crc kubenswrapper[4699]: I0226 11:56:06.274221 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e270a2c1-b1c4-498d-9adf-a3cbb51defce" path="/var/lib/kubelet/pods/e270a2c1-b1c4-498d-9adf-a3cbb51defce/volumes"
Feb 26 11:56:06 crc kubenswrapper[4699]: I0226 11:56:06.724638 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerStarted","Data":"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"}
Feb 26 11:56:06 crc kubenswrapper[4699]: I0226 11:56:06.748362 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fx9d" podStartSLOduration=2.324149415 podStartE2EDuration="4.748343567s" podCreationTimestamp="2026-02-26 11:56:02 +0000 UTC" firstStartedPulling="2026-02-26 11:56:03.67297054 +0000 UTC m=+2709.483796974" lastFinishedPulling="2026-02-26 11:56:06.097164692 +0000 UTC m=+2711.907991126" observedRunningTime="2026-02-26 11:56:06.742922704 +0000 UTC m=+2712.553749158" watchObservedRunningTime="2026-02-26 11:56:06.748343567 +0000 UTC m=+2712.559170001"
Feb 26 11:56:12 crc kubenswrapper[4699]: I0226 11:56:12.450006 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:12 crc kubenswrapper[4699]: I0226 11:56:12.450774 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:12 crc kubenswrapper[4699]: I0226 11:56:12.504033 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:12 crc kubenswrapper[4699]: I0226 11:56:12.823676 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:12 crc kubenswrapper[4699]: I0226 11:56:12.874661 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:14 crc kubenswrapper[4699]: I0226 11:56:14.795613 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fx9d" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="registry-server" containerID="cri-o://44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1" gracePeriod=2
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.238559 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.371967 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content\") pod \"b6d70b9c-e164-4128-9ed5-36526cbc378a\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") "
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.372292 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities\") pod \"b6d70b9c-e164-4128-9ed5-36526cbc378a\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") "
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.373324 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities" (OuterVolumeSpecName: "utilities") pod "b6d70b9c-e164-4128-9ed5-36526cbc378a" (UID: "b6d70b9c-e164-4128-9ed5-36526cbc378a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.373653 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74pvl\" (UniqueName: \"kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl\") pod \"b6d70b9c-e164-4128-9ed5-36526cbc378a\" (UID: \"b6d70b9c-e164-4128-9ed5-36526cbc378a\") "
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.375242 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.384497 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl" (OuterVolumeSpecName: "kube-api-access-74pvl") pod "b6d70b9c-e164-4128-9ed5-36526cbc378a" (UID: "b6d70b9c-e164-4128-9ed5-36526cbc378a"). InnerVolumeSpecName "kube-api-access-74pvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.403944 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6d70b9c-e164-4128-9ed5-36526cbc378a" (UID: "b6d70b9c-e164-4128-9ed5-36526cbc378a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.477320 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74pvl\" (UniqueName: \"kubernetes.io/projected/b6d70b9c-e164-4128-9ed5-36526cbc378a-kube-api-access-74pvl\") on node \"crc\" DevicePath \"\""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.477362 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d70b9c-e164-4128-9ed5-36526cbc378a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.824880 4699 generic.go:334] "Generic (PLEG): container finished" podID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerID="44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1" exitCode=0
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.824951 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fx9d"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.824969 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerDied","Data":"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"}
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.826202 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fx9d" event={"ID":"b6d70b9c-e164-4128-9ed5-36526cbc378a","Type":"ContainerDied","Data":"7ef6d935dedc52e617caaf85457a1f25f23f860e59a7f5ec692c0136e824c8d6"}
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.826243 4699 scope.go:117] "RemoveContainer" containerID="44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.861600 4699 scope.go:117] "RemoveContainer" containerID="b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.872395 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.880923 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fx9d"]
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.887526 4699 scope.go:117] "RemoveContainer" containerID="322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.927456 4699 scope.go:117] "RemoveContainer" containerID="44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"
Feb 26 11:56:15 crc kubenswrapper[4699]: E0226 11:56:15.927991 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1\": container with ID starting with 44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1 not found: ID does not exist" containerID="44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.928099 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1"} err="failed to get container status \"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1\": rpc error: code = NotFound desc = could not find container \"44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1\": container with ID starting with 44e48f2cdac118ffb00f481fbec6095f53c866c7375c9c1b41c179d0e7de3da1 not found: ID does not exist"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.928257 4699 scope.go:117] "RemoveContainer" containerID="b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944"
Feb 26 11:56:15 crc kubenswrapper[4699]: E0226 11:56:15.928796 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944\": container with ID starting with b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944 not found: ID does not exist" containerID="b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.928833 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944"} err="failed to get container status \"b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944\": rpc error: code = NotFound desc = could not find container \"b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944\": container with ID starting with b819d5a5eb8a18ead17e03fc7ee8635e05baef41d12dc5fde7391cfb14d92944 not found: ID does not exist"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.928855 4699 scope.go:117] "RemoveContainer" containerID="322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b"
Feb 26 11:56:15 crc kubenswrapper[4699]: E0226 11:56:15.929254 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b\": container with ID starting with 322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b not found: ID does not exist" containerID="322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b"
Feb 26 11:56:15 crc kubenswrapper[4699]: I0226 11:56:15.929353 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b"} err="failed to get container status \"322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b\": rpc error: code = NotFound desc = could not find container \"322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b\": container with ID starting with 322fd2dcdda72d493e836e24ccab86fb0862ced48640c0c0225773af5aa27f1b not found: ID does not exist"
Feb 26 11:56:16 crc kubenswrapper[4699]: I0226 11:56:16.272582 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" path="/var/lib/kubelet/pods/b6d70b9c-e164-4128-9ed5-36526cbc378a/volumes"
Feb 26 11:56:31 crc kubenswrapper[4699]: I0226 11:56:31.856628 4699 scope.go:117] "RemoveContainer" containerID="5c3d5a2f0c08caa11b3efe5f7dadcab2f42f5d3eecfcc331eaac28aadfec2f57"
Feb 26 11:56:41 crc kubenswrapper[4699]: I0226 11:56:41.584564 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:56:41 crc kubenswrapper[4699]: I0226 11:56:41.585358 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 11:57:04 crc kubenswrapper[4699]: I0226 11:57:04.251174 4699 generic.go:334] "Generic (PLEG): container finished" podID="08bdd16a-fc18-4262-9175-a05b613a76c9" containerID="43d0f761beaf929bd7b88c678a07a81fe65b54f961758754c84f435ce7b8d8cb" exitCode=0
Feb 26 11:57:04 crc kubenswrapper[4699]: I0226 11:57:04.251296 4699 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" event={"ID":"08bdd16a-fc18-4262-9175-a05b613a76c9","Type":"ContainerDied","Data":"43d0f761beaf929bd7b88c678a07a81fe65b54f961758754c84f435ce7b8d8cb"} Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.685390 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.776043 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.805810 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.877676 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.877762 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.877924 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.878018 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.878732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjjd7\" (UniqueName: \"kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.878756 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0\") pod \"08bdd16a-fc18-4262-9175-a05b613a76c9\" (UID: \"08bdd16a-fc18-4262-9175-a05b613a76c9\") " Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.879416 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.881918 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.882547 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7" (OuterVolumeSpecName: "kube-api-access-qjjd7") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "kube-api-access-qjjd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.903964 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.905014 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory" (OuterVolumeSpecName: "inventory") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.906809 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.911780 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "08bdd16a-fc18-4262-9175-a05b613a76c9" (UID: "08bdd16a-fc18-4262-9175-a05b613a76c9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980829 4699 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980871 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980882 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980892 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjjd7\" (UniqueName: \"kubernetes.io/projected/08bdd16a-fc18-4262-9175-a05b613a76c9-kube-api-access-qjjd7\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980902 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:05 crc kubenswrapper[4699]: I0226 11:57:05.980917 4699 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdd16a-fc18-4262-9175-a05b613a76c9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 11:57:06 crc kubenswrapper[4699]: I0226 11:57:06.275037 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" 
event={"ID":"08bdd16a-fc18-4262-9175-a05b613a76c9","Type":"ContainerDied","Data":"14466ace843fec23ce73560f034f012e5b9e5664261a1686721bb34a65e7ea16"} Feb 26 11:57:06 crc kubenswrapper[4699]: I0226 11:57:06.275074 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14466ace843fec23ce73560f034f012e5b9e5664261a1686721bb34a65e7ea16" Feb 26 11:57:06 crc kubenswrapper[4699]: I0226 11:57:06.275144 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9" Feb 26 11:57:11 crc kubenswrapper[4699]: I0226 11:57:11.584757 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:57:11 crc kubenswrapper[4699]: I0226 11:57:11.585550 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.585422 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.586075 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.586142 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.587024 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.587091 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f" gracePeriod=600 Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.730782 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f" exitCode=0 Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.730833 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f"} Feb 26 11:57:41 crc kubenswrapper[4699]: I0226 11:57:41.730875 4699 scope.go:117] "RemoveContainer" containerID="c9ba98af8e384a46c220f982325ae50f5859e35e543bee096d657cca90e89dde" Feb 26 11:57:42 crc kubenswrapper[4699]: I0226 11:57:42.742405 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"} Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.381533 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:57:43 crc kubenswrapper[4699]: E0226 11:57:43.382771 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67852f2-cfab-4e51-b986-30f2a582877d" containerName="oc" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.382792 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67852f2-cfab-4e51-b986-30f2a582877d" containerName="oc" Feb 26 11:57:43 crc kubenswrapper[4699]: E0226 11:57:43.382809 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="registry-server" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.382816 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="registry-server" Feb 26 11:57:43 crc kubenswrapper[4699]: E0226 11:57:43.382835 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="extract-content" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.382842 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="extract-content" Feb 26 11:57:43 crc kubenswrapper[4699]: E0226 11:57:43.382860 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08bdd16a-fc18-4262-9175-a05b613a76c9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.382867 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bdd16a-fc18-4262-9175-a05b613a76c9" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 11:57:43 crc kubenswrapper[4699]: E0226 11:57:43.382876 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="extract-utilities" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.382882 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="extract-utilities" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.383086 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="08bdd16a-fc18-4262-9175-a05b613a76c9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.383109 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d70b9c-e164-4128-9ed5-36526cbc378a" containerName="registry-server" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.383142 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67852f2-cfab-4e51-b986-30f2a582877d" containerName="oc" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.384505 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.395394 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.397910 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.397966 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.398020 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjbk\" (UniqueName: \"kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.500000 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.500082 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.500170 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjbk\" (UniqueName: \"kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.500574 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.500674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.524854 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjbk\" (UniqueName: \"kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk\") pod \"redhat-operators-qrkbm\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:43 crc kubenswrapper[4699]: I0226 11:57:43.706289 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:44 crc kubenswrapper[4699]: I0226 11:57:44.215765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:57:44 crc kubenswrapper[4699]: W0226 11:57:44.218315 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9631f0a5_2f36_4dc0_a473_38fe1d97215d.slice/crio-bf3ecd22b85cbf24c93a2ad16cffc894ab5fa7f0bb05ff2228704a19c193be6a WatchSource:0}: Error finding container bf3ecd22b85cbf24c93a2ad16cffc894ab5fa7f0bb05ff2228704a19c193be6a: Status 404 returned error can't find the container with id bf3ecd22b85cbf24c93a2ad16cffc894ab5fa7f0bb05ff2228704a19c193be6a Feb 26 11:57:44 crc kubenswrapper[4699]: I0226 11:57:44.764237 4699 generic.go:334] "Generic (PLEG): container finished" podID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerID="3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba" exitCode=0 Feb 26 11:57:44 crc kubenswrapper[4699]: I0226 11:57:44.764639 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerDied","Data":"3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba"} Feb 26 11:57:44 crc kubenswrapper[4699]: I0226 11:57:44.764663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerStarted","Data":"bf3ecd22b85cbf24c93a2ad16cffc894ab5fa7f0bb05ff2228704a19c193be6a"} Feb 26 11:57:44 crc kubenswrapper[4699]: I0226 11:57:44.768033 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 11:57:47 crc kubenswrapper[4699]: I0226 11:57:47.796312 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerStarted","Data":"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575"} Feb 26 11:57:50 crc kubenswrapper[4699]: I0226 11:57:50.829812 4699 generic.go:334] "Generic (PLEG): container finished" podID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerID="53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575" exitCode=0 Feb 26 11:57:50 crc kubenswrapper[4699]: I0226 11:57:50.829904 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerDied","Data":"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575"} Feb 26 11:57:52 crc kubenswrapper[4699]: I0226 11:57:52.851535 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerStarted","Data":"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc"} Feb 26 11:57:52 crc kubenswrapper[4699]: I0226 11:57:52.870678 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrkbm" podStartSLOduration=2.535592525 podStartE2EDuration="9.870661088s" podCreationTimestamp="2026-02-26 11:57:43 +0000 UTC" firstStartedPulling="2026-02-26 11:57:44.767289204 +0000 UTC m=+2810.578115648" lastFinishedPulling="2026-02-26 11:57:52.102357777 +0000 UTC m=+2817.913184211" observedRunningTime="2026-02-26 11:57:52.86833041 +0000 UTC m=+2818.679156864" watchObservedRunningTime="2026-02-26 11:57:52.870661088 +0000 UTC m=+2818.681487512" Feb 26 11:57:53 crc kubenswrapper[4699]: I0226 11:57:53.706490 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:53 crc kubenswrapper[4699]: I0226 11:57:53.706554 4699 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:57:54 crc kubenswrapper[4699]: I0226 11:57:54.770101 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrkbm" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="registry-server" probeResult="failure" output=< Feb 26 11:57:54 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Feb 26 11:57:54 crc kubenswrapper[4699]: > Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.148859 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535118-n92bn"] Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.152193 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.155180 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.155243 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.155287 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.162757 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535118-n92bn"] Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.276771 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sll54\" (UniqueName: \"kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54\") pod \"auto-csr-approver-29535118-n92bn\" (UID: \"6bde379e-7dd7-4b4b-bc25-b83d0174b100\") " 
pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.378267 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sll54\" (UniqueName: \"kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54\") pod \"auto-csr-approver-29535118-n92bn\" (UID: \"6bde379e-7dd7-4b4b-bc25-b83d0174b100\") " pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.409357 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sll54\" (UniqueName: \"kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54\") pod \"auto-csr-approver-29535118-n92bn\" (UID: \"6bde379e-7dd7-4b4b-bc25-b83d0174b100\") " pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:00 crc kubenswrapper[4699]: I0226 11:58:00.482740 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:01 crc kubenswrapper[4699]: I0226 11:58:01.016231 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535118-n92bn"] Feb 26 11:58:01 crc kubenswrapper[4699]: W0226 11:58:01.027679 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bde379e_7dd7_4b4b_bc25_b83d0174b100.slice/crio-91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4 WatchSource:0}: Error finding container 91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4: Status 404 returned error can't find the container with id 91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4 Feb 26 11:58:01 crc kubenswrapper[4699]: I0226 11:58:01.930984 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535118-n92bn" 
event={"ID":"6bde379e-7dd7-4b4b-bc25-b83d0174b100","Type":"ContainerStarted","Data":"91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4"} Feb 26 11:58:02 crc kubenswrapper[4699]: I0226 11:58:02.940498 4699 generic.go:334] "Generic (PLEG): container finished" podID="6bde379e-7dd7-4b4b-bc25-b83d0174b100" containerID="7ee1327d152002290262452d2af09136d94e1e411a1eeb32531cce9b1d48c20c" exitCode=0 Feb 26 11:58:02 crc kubenswrapper[4699]: I0226 11:58:02.940554 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535118-n92bn" event={"ID":"6bde379e-7dd7-4b4b-bc25-b83d0174b100","Type":"ContainerDied","Data":"7ee1327d152002290262452d2af09136d94e1e411a1eeb32531cce9b1d48c20c"} Feb 26 11:58:03 crc kubenswrapper[4699]: I0226 11:58:03.753948 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:58:03 crc kubenswrapper[4699]: I0226 11:58:03.802429 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:58:03 crc kubenswrapper[4699]: I0226 11:58:03.991278 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.343659 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.484142 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sll54\" (UniqueName: \"kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54\") pod \"6bde379e-7dd7-4b4b-bc25-b83d0174b100\" (UID: \"6bde379e-7dd7-4b4b-bc25-b83d0174b100\") " Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.493488 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54" (OuterVolumeSpecName: "kube-api-access-sll54") pod "6bde379e-7dd7-4b4b-bc25-b83d0174b100" (UID: "6bde379e-7dd7-4b4b-bc25-b83d0174b100"). InnerVolumeSpecName "kube-api-access-sll54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.586376 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sll54\" (UniqueName: \"kubernetes.io/projected/6bde379e-7dd7-4b4b-bc25-b83d0174b100-kube-api-access-sll54\") on node \"crc\" DevicePath \"\"" Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.959744 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535118-n92bn" Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.959757 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535118-n92bn" event={"ID":"6bde379e-7dd7-4b4b-bc25-b83d0174b100","Type":"ContainerDied","Data":"91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4"} Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.959799 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91099037474b511010fb5109b00e16bf53ed98c580d5b518a258c4893ae89cc4" Feb 26 11:58:04 crc kubenswrapper[4699]: I0226 11:58:04.960030 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrkbm" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="registry-server" containerID="cri-o://4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc" gracePeriod=2 Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.426357 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535112-jg5nd"] Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.437725 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535112-jg5nd"] Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.528862 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.714364 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content\") pod \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.716076 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities\") pod \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.716211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grjbk\" (UniqueName: \"kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk\") pod \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\" (UID: \"9631f0a5-2f36-4dc0-a473-38fe1d97215d\") " Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.717070 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities" (OuterVolumeSpecName: "utilities") pod "9631f0a5-2f36-4dc0-a473-38fe1d97215d" (UID: "9631f0a5-2f36-4dc0-a473-38fe1d97215d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.728999 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk" (OuterVolumeSpecName: "kube-api-access-grjbk") pod "9631f0a5-2f36-4dc0-a473-38fe1d97215d" (UID: "9631f0a5-2f36-4dc0-a473-38fe1d97215d"). InnerVolumeSpecName "kube-api-access-grjbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.818812 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.818847 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grjbk\" (UniqueName: \"kubernetes.io/projected/9631f0a5-2f36-4dc0-a473-38fe1d97215d-kube-api-access-grjbk\") on node \"crc\" DevicePath \"\"" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.835272 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9631f0a5-2f36-4dc0-a473-38fe1d97215d" (UID: "9631f0a5-2f36-4dc0-a473-38fe1d97215d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.920244 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9631f0a5-2f36-4dc0-a473-38fe1d97215d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.970655 4699 generic.go:334] "Generic (PLEG): container finished" podID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerID="4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc" exitCode=0 Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.970721 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrkbm" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.970725 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerDied","Data":"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc"} Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.970838 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrkbm" event={"ID":"9631f0a5-2f36-4dc0-a473-38fe1d97215d","Type":"ContainerDied","Data":"bf3ecd22b85cbf24c93a2ad16cffc894ab5fa7f0bb05ff2228704a19c193be6a"} Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.970858 4699 scope.go:117] "RemoveContainer" containerID="4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc" Feb 26 11:58:05 crc kubenswrapper[4699]: I0226 11:58:05.998596 4699 scope.go:117] "RemoveContainer" containerID="53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.010406 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.018939 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrkbm"] Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.041435 4699 scope.go:117] "RemoveContainer" containerID="3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.063088 4699 scope.go:117] "RemoveContainer" containerID="4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.063671 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc\": container with ID starting with 4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc not found: ID does not exist" containerID="4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.063707 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc"} err="failed to get container status \"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc\": rpc error: code = NotFound desc = could not find container \"4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc\": container with ID starting with 4c7e57aede1fc2c2d0c835fac1c21b609cc5544169e8bce6fd32a979b79f6ebc not found: ID does not exist" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.063729 4699 scope.go:117] "RemoveContainer" containerID="53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.064051 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575\": container with ID starting with 53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575 not found: ID does not exist" containerID="53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.064077 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575"} err="failed to get container status \"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575\": rpc error: code = NotFound desc = could not find container \"53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575\": container with ID 
starting with 53f520b4d9fd9ed3058f67f634d54cf5c2b1235187b713b704194c6029177575 not found: ID does not exist" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.064097 4699 scope.go:117] "RemoveContainer" containerID="3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.064475 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba\": container with ID starting with 3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba not found: ID does not exist" containerID="3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.064503 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba"} err="failed to get container status \"3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba\": rpc error: code = NotFound desc = could not find container \"3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba\": container with ID starting with 3d779da817a1ce4610ae811406c88b435f18d42b5fe4797bebcfec8947827eba not found: ID does not exist" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.270501 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae8ab95-85cc-473a-bbe7-6065a75e5720" path="/var/lib/kubelet/pods/5ae8ab95-85cc-473a-bbe7-6065a75e5720/volumes" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.271296 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" path="/var/lib/kubelet/pods/9631f0a5-2f36-4dc0-a473-38fe1d97215d/volumes" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.613678 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] 
Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.614091 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="extract-utilities" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614105 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="extract-utilities" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.614130 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="registry-server" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614141 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="registry-server" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.614166 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bde379e-7dd7-4b4b-bc25-b83d0174b100" containerName="oc" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614174 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bde379e-7dd7-4b4b-bc25-b83d0174b100" containerName="oc" Feb 26 11:58:06 crc kubenswrapper[4699]: E0226 11:58:06.614189 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="extract-content" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614195 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="extract-content" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614382 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9631f0a5-2f36-4dc0-a473-38fe1d97215d" containerName="registry-server" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.614398 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bde379e-7dd7-4b4b-bc25-b83d0174b100" containerName="oc" Feb 26 11:58:06 crc kubenswrapper[4699]: 
I0226 11:58:06.615231 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.618188 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fmwlb" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.618440 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.618506 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.618442 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.636511 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734506 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734558 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734597 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734640 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734727 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw89z\" (UniqueName: \"kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734763 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734787 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.734996 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.836861 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.836961 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837050 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837080 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") 
pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837100 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837137 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837161 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837186 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw89z\" (UniqueName: \"kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837217 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837610 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837887 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.837975 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.838401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.840274 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " 
pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.842721 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.842893 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.846652 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.858730 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw89z\" (UniqueName: \"kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.877089 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " pod="openstack/tempest-tests-tempest" Feb 26 11:58:06 crc kubenswrapper[4699]: I0226 11:58:06.937469 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 11:58:07 crc kubenswrapper[4699]: I0226 11:58:07.381636 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 11:58:08 crc kubenswrapper[4699]: I0226 11:58:08.333829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19e02200-91be-49f8-8174-4a0bf6cda9dd","Type":"ContainerStarted","Data":"0f178f25ec5476c2b73a67092a0049cc1be8c1984e676d6f03c82e6dac970a0f"} Feb 26 11:58:31 crc kubenswrapper[4699]: I0226 11:58:31.961085 4699 scope.go:117] "RemoveContainer" containerID="e246a9fcedf1306ea4a405c16944f8ad4f9cf630b0ec81a4cd3160f4b051a918" Feb 26 11:58:38 crc kubenswrapper[4699]: E0226 11:58:38.991011 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 26 11:58:38 crc kubenswrapper[4699]: E0226 11:58:38.991950 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qw89z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(19e02200-91be-49f8-8174-4a0bf6cda9dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 26 11:58:38 crc kubenswrapper[4699]: E0226 11:58:38.993229 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="19e02200-91be-49f8-8174-4a0bf6cda9dd"
Feb 26 11:58:39 crc kubenswrapper[4699]: E0226 11:58:39.638075 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="19e02200-91be-49f8-8174-4a0bf6cda9dd"
Feb 26 11:58:55 crc kubenswrapper[4699]: I0226 11:58:55.781989 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19e02200-91be-49f8-8174-4a0bf6cda9dd","Type":"ContainerStarted","Data":"084f210d6c46d1c100bf0bcfdc7ffd17238944ee1beffdf271d0e8035c249561"}
Feb 26 11:58:55 crc kubenswrapper[4699]: I0226 11:58:55.799343 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.503911135 podStartE2EDuration="50.799327305s" podCreationTimestamp="2026-02-26 11:58:05 +0000 UTC" firstStartedPulling="2026-02-26 11:58:07.386191333 +0000 UTC m=+2833.197017767" lastFinishedPulling="2026-02-26 11:58:53.681607513 +0000 UTC m=+2879.492433937" observedRunningTime="2026-02-26 11:58:55.797550864 +0000 UTC m=+2881.608377308" watchObservedRunningTime="2026-02-26 11:58:55.799327305 +0000 UTC m=+2881.610153739"
Feb 26 11:59:41 crc kubenswrapper[4699]: I0226 11:59:41.584769 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 11:59:41 crc kubenswrapper[4699]: I0226 11:59:41.585612 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.153432 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"]
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.155679 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.158900 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.159154 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.164511 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535120-xkftf"]
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.166469 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.171998 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.172208 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.172233 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.174571 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535120-xkftf"]
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.183686 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"]
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.325638 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.325692 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9v4\" (UniqueName: \"kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4\") pod \"auto-csr-approver-29535120-xkftf\" (UID: \"062171a4-9cf3-460e-822d-2dc7b5baaf9b\") " pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.325986 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbhdd\" (UniqueName: \"kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.326219 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.428652 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.429203 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.429246 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9v4\" (UniqueName: \"kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4\") pod \"auto-csr-approver-29535120-xkftf\" (UID: \"062171a4-9cf3-460e-822d-2dc7b5baaf9b\") " pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.429380 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbhdd\" (UniqueName: \"kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.429990 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.439984 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.444852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9v4\" (UniqueName: \"kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4\") pod \"auto-csr-approver-29535120-xkftf\" (UID: \"062171a4-9cf3-460e-822d-2dc7b5baaf9b\") " pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.447811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbhdd\" (UniqueName: \"kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd\") pod \"collect-profiles-29535120-q8qcg\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.488942 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:00 crc kubenswrapper[4699]: I0226 12:00:00.496322 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.007930 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535120-xkftf"]
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.018490 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"]
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.768295 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535120-xkftf" event={"ID":"062171a4-9cf3-460e-822d-2dc7b5baaf9b","Type":"ContainerStarted","Data":"c26d871505780476f7154f1136c24b88741942dcf05b640336ae294176e2c781"}
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.770770 4699 generic.go:334] "Generic (PLEG): container finished" podID="26ea785f-e6f4-487c-9c19-f7bff53a2a12" containerID="304eeefd135ff84fa620ff0aadd68b2912afb8f7d23f40cde4711f81278e81fe" exitCode=0
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.770834 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg" event={"ID":"26ea785f-e6f4-487c-9c19-f7bff53a2a12","Type":"ContainerDied","Data":"304eeefd135ff84fa620ff0aadd68b2912afb8f7d23f40cde4711f81278e81fe"}
Feb 26 12:00:01 crc kubenswrapper[4699]: I0226 12:00:01.770895 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg" event={"ID":"26ea785f-e6f4-487c-9c19-f7bff53a2a12","Type":"ContainerStarted","Data":"8d63703d2ccf60e02ccc5992cdbb35dabbc475b53bce0c58d259ee68fd5667ef"}
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.165251 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.284210 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume\") pod \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") "
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.284467 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume\") pod \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") "
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.284491 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbhdd\" (UniqueName: \"kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd\") pod \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\" (UID: \"26ea785f-e6f4-487c-9c19-f7bff53a2a12\") "
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.285377 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume" (OuterVolumeSpecName: "config-volume") pod "26ea785f-e6f4-487c-9c19-f7bff53a2a12" (UID: "26ea785f-e6f4-487c-9c19-f7bff53a2a12"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.290898 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "26ea785f-e6f4-487c-9c19-f7bff53a2a12" (UID: "26ea785f-e6f4-487c-9c19-f7bff53a2a12"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.290937 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd" (OuterVolumeSpecName: "kube-api-access-fbhdd") pod "26ea785f-e6f4-487c-9c19-f7bff53a2a12" (UID: "26ea785f-e6f4-487c-9c19-f7bff53a2a12"). InnerVolumeSpecName "kube-api-access-fbhdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.387775 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ea785f-e6f4-487c-9c19-f7bff53a2a12-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.387980 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbhdd\" (UniqueName: \"kubernetes.io/projected/26ea785f-e6f4-487c-9c19-f7bff53a2a12-kube-api-access-fbhdd\") on node \"crc\" DevicePath \"\""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.388070 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26ea785f-e6f4-487c-9c19-f7bff53a2a12-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.791181 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg" event={"ID":"26ea785f-e6f4-487c-9c19-f7bff53a2a12","Type":"ContainerDied","Data":"8d63703d2ccf60e02ccc5992cdbb35dabbc475b53bce0c58d259ee68fd5667ef"}
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.791227 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d63703d2ccf60e02ccc5992cdbb35dabbc475b53bce0c58d259ee68fd5667ef"
Feb 26 12:00:03 crc kubenswrapper[4699]: I0226 12:00:03.791262 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535120-q8qcg"
Feb 26 12:00:04 crc kubenswrapper[4699]: I0226 12:00:04.254831 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"]
Feb 26 12:00:04 crc kubenswrapper[4699]: I0226 12:00:04.273276 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535075-hl4g4"]
Feb 26 12:00:06 crc kubenswrapper[4699]: I0226 12:00:06.321360 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8aec36-74ad-4c69-baf8-d672010495e9" path="/var/lib/kubelet/pods/ed8aec36-74ad-4c69-baf8-d672010495e9/volumes"
Feb 26 12:00:06 crc kubenswrapper[4699]: I0226 12:00:06.819956 4699 generic.go:334] "Generic (PLEG): container finished" podID="062171a4-9cf3-460e-822d-2dc7b5baaf9b" containerID="133bac2de294eabd3d63693bc2552e8927f3fa0a60ee9ff7dd1f74c8eac8b98e" exitCode=0
Feb 26 12:00:06 crc kubenswrapper[4699]: I0226 12:00:06.820024 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535120-xkftf" event={"ID":"062171a4-9cf3-460e-822d-2dc7b5baaf9b","Type":"ContainerDied","Data":"133bac2de294eabd3d63693bc2552e8927f3fa0a60ee9ff7dd1f74c8eac8b98e"}
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.444577 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.561669 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9v4\" (UniqueName: \"kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4\") pod \"062171a4-9cf3-460e-822d-2dc7b5baaf9b\" (UID: \"062171a4-9cf3-460e-822d-2dc7b5baaf9b\") "
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.569419 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4" (OuterVolumeSpecName: "kube-api-access-rs9v4") pod "062171a4-9cf3-460e-822d-2dc7b5baaf9b" (UID: "062171a4-9cf3-460e-822d-2dc7b5baaf9b"). InnerVolumeSpecName "kube-api-access-rs9v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.665016 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9v4\" (UniqueName: \"kubernetes.io/projected/062171a4-9cf3-460e-822d-2dc7b5baaf9b-kube-api-access-rs9v4\") on node \"crc\" DevicePath \"\""
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.839687 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535120-xkftf" event={"ID":"062171a4-9cf3-460e-822d-2dc7b5baaf9b","Type":"ContainerDied","Data":"c26d871505780476f7154f1136c24b88741942dcf05b640336ae294176e2c781"}
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.839715 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535120-xkftf"
Feb 26 12:00:08 crc kubenswrapper[4699]: I0226 12:00:08.839728 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26d871505780476f7154f1136c24b88741942dcf05b640336ae294176e2c781"
Feb 26 12:00:09 crc kubenswrapper[4699]: I0226 12:00:09.503009 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535114-zp6df"]
Feb 26 12:00:09 crc kubenswrapper[4699]: I0226 12:00:09.511445 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535114-zp6df"]
Feb 26 12:00:10 crc kubenswrapper[4699]: I0226 12:00:10.273598 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8acba14-233d-44a8-98b6-93df64a45300" path="/var/lib/kubelet/pods/c8acba14-233d-44a8-98b6-93df64a45300/volumes"
Feb 26 12:00:11 crc kubenswrapper[4699]: I0226 12:00:11.584861 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 12:00:11 crc kubenswrapper[4699]: I0226 12:00:11.586286 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:00:38 crc kubenswrapper[4699]: I0226 12:00:38.948709 4699 scope.go:117] "RemoveContainer" containerID="1a649c81866f7635a569ca368b86ef4aadb641a91575dd77e87694a700822950"
Feb 26 12:00:38 crc kubenswrapper[4699]: I0226 12:00:38.983470 4699 scope.go:117] "RemoveContainer" containerID="ffa425939368131f51ca5df0c799cff39019457552b4886c8f2b5719e7868319"
Feb 26 12:00:41 crc kubenswrapper[4699]: I0226 12:00:41.584948 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 12:00:41 crc kubenswrapper[4699]: I0226 12:00:41.585535 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:00:41 crc kubenswrapper[4699]: I0226 12:00:41.585621 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79"
Feb 26 12:00:41 crc kubenswrapper[4699]: I0226 12:00:41.586912 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 12:00:41 crc kubenswrapper[4699]: I0226 12:00:41.587009 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" gracePeriod=600
Feb 26 12:00:41 crc kubenswrapper[4699]: E0226 12:00:41.709655 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:00:42 crc kubenswrapper[4699]: I0226 12:00:42.147396 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" exitCode=0
Feb 26 12:00:42 crc kubenswrapper[4699]: I0226 12:00:42.147479 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"}
Feb 26 12:00:42 crc kubenswrapper[4699]: I0226 12:00:42.147551 4699 scope.go:117] "RemoveContainer" containerID="4c58903dcb4a12909fcb0583a1d55149dc4c175867d594a32d984dde51ae536f"
Feb 26 12:00:42 crc kubenswrapper[4699]: I0226 12:00:42.148861 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:00:42 crc kubenswrapper[4699]: E0226 12:00:42.149429 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:00:53 crc kubenswrapper[4699]: I0226 12:00:53.261318 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:00:53 crc kubenswrapper[4699]: E0226 12:00:53.262199 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.155355 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29535121-plvtd"]
Feb 26 12:01:00 crc kubenswrapper[4699]: E0226 12:01:00.157360 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ea785f-e6f4-487c-9c19-f7bff53a2a12" containerName="collect-profiles"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.157451 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ea785f-e6f4-487c-9c19-f7bff53a2a12" containerName="collect-profiles"
Feb 26 12:01:00 crc kubenswrapper[4699]: E0226 12:01:00.157510 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062171a4-9cf3-460e-822d-2dc7b5baaf9b" containerName="oc"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.157563 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="062171a4-9cf3-460e-822d-2dc7b5baaf9b" containerName="oc"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.157798 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ea785f-e6f4-487c-9c19-f7bff53a2a12" containerName="collect-profiles"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.157873 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="062171a4-9cf3-460e-822d-2dc7b5baaf9b" containerName="oc"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.158691 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.169770 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535121-plvtd"]
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.353870 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.353993 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.354148 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.354185 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ng9p\" (UniqueName: \"kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.455659 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.455715 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.455732 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ng9p\" (UniqueName: \"kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.455844 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.464212 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.464417 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.464839 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.475892 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ng9p\" (UniqueName: \"kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p\") pod \"keystone-cron-29535121-plvtd\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") " pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.499709 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:00 crc kubenswrapper[4699]: I0226 12:01:00.956667 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535121-plvtd"]
Feb 26 12:01:00 crc kubenswrapper[4699]: W0226 12:01:00.961411 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef1e8bd7_66e8_4eef_979e_8bf3e57b2a68.slice/crio-3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb WatchSource:0}: Error finding container 3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb: Status 404 returned error can't find the container with id 3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb
Feb 26 12:01:01 crc kubenswrapper[4699]: I0226 12:01:01.327850 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535121-plvtd" event={"ID":"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68","Type":"ContainerStarted","Data":"f82b7e69cb8fe7ef0bdf92eb3048b514e80df2fb3095107990fb1a608f73583a"}
Feb 26 12:01:01 crc kubenswrapper[4699]: I0226 12:01:01.328455 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535121-plvtd" event={"ID":"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68","Type":"ContainerStarted","Data":"3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb"}
Feb 26 12:01:01 crc kubenswrapper[4699]: I0226 12:01:01.354722 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535121-plvtd" podStartSLOduration=1.354690347 podStartE2EDuration="1.354690347s" podCreationTimestamp="2026-02-26 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 12:01:01.344929117 +0000 UTC m=+3007.155755571" watchObservedRunningTime="2026-02-26 12:01:01.354690347 +0000 UTC m=+3007.165516791"
Feb 26 12:01:03 crc kubenswrapper[4699]: I0226 12:01:03.346695 4699 generic.go:334] "Generic (PLEG): container finished" podID="ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" containerID="f82b7e69cb8fe7ef0bdf92eb3048b514e80df2fb3095107990fb1a608f73583a" exitCode=0
Feb 26 12:01:03 crc kubenswrapper[4699]: I0226 12:01:03.346785 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535121-plvtd" event={"ID":"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68","Type":"ContainerDied","Data":"f82b7e69cb8fe7ef0bdf92eb3048b514e80df2fb3095107990fb1a608f73583a"}
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.262023 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:01:04 crc kubenswrapper[4699]: E0226 12:01:04.262352 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.751931 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535121-plvtd"
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.851616 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ng9p\" (UniqueName: \"kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p\") pod \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") "
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.851711 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data\") pod \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") "
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.851751 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle\") pod \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") "
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.851805 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys\") pod \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\" (UID: \"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68\") "
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.857875 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" (UID: "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.871503 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p" (OuterVolumeSpecName: "kube-api-access-6ng9p") pod "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" (UID: "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68"). InnerVolumeSpecName "kube-api-access-6ng9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.894553 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" (UID: "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.917810 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data" (OuterVolumeSpecName: "config-data") pod "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" (UID: "ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.954763 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ng9p\" (UniqueName: \"kubernetes.io/projected/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-kube-api-access-6ng9p\") on node \"crc\" DevicePath \"\"" Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.954796 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.954805 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 12:01:04 crc kubenswrapper[4699]: I0226 12:01:04.954813 4699 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 12:01:05 crc kubenswrapper[4699]: I0226 12:01:05.369234 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535121-plvtd" event={"ID":"ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68","Type":"ContainerDied","Data":"3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb"} Feb 26 12:01:05 crc kubenswrapper[4699]: I0226 12:01:05.369702 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d58b8dcce2fcc1b977dea1a22af6516e2315c5295cd925274627d773832fceb" Feb 26 12:01:05 crc kubenswrapper[4699]: I0226 12:01:05.369309 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535121-plvtd" Feb 26 12:01:18 crc kubenswrapper[4699]: I0226 12:01:18.260782 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:01:18 crc kubenswrapper[4699]: E0226 12:01:18.261576 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:01:33 crc kubenswrapper[4699]: I0226 12:01:33.261234 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:01:33 crc kubenswrapper[4699]: E0226 12:01:33.262419 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:01:46 crc kubenswrapper[4699]: I0226 12:01:46.270380 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:01:46 crc kubenswrapper[4699]: E0226 12:01:46.271356 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.152052 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535122-xp67b"] Feb 26 12:02:00 crc kubenswrapper[4699]: E0226 12:02:00.153266 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" containerName="keystone-cron" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.153284 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" containerName="keystone-cron" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.153572 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68" containerName="keystone-cron" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.154378 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.159663 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.159685 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.160468 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.162459 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535122-xp67b"] Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.268195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg5h4\" (UniqueName: 
\"kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4\") pod \"auto-csr-approver-29535122-xp67b\" (UID: \"27a271ab-4d30-4863-b3f6-74750cc65a91\") " pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.370426 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5h4\" (UniqueName: \"kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4\") pod \"auto-csr-approver-29535122-xp67b\" (UID: \"27a271ab-4d30-4863-b3f6-74750cc65a91\") " pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.393027 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg5h4\" (UniqueName: \"kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4\") pod \"auto-csr-approver-29535122-xp67b\" (UID: \"27a271ab-4d30-4863-b3f6-74750cc65a91\") " pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.490848 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:00 crc kubenswrapper[4699]: I0226 12:02:00.956147 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535122-xp67b"] Feb 26 12:02:01 crc kubenswrapper[4699]: I0226 12:02:01.031331 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535122-xp67b" event={"ID":"27a271ab-4d30-4863-b3f6-74750cc65a91","Type":"ContainerStarted","Data":"429c6a30970d5da958deab22c2e4bb6cf27c687d3ad8243aeacaea04a0d870dd"} Feb 26 12:02:01 crc kubenswrapper[4699]: I0226 12:02:01.260633 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:02:01 crc kubenswrapper[4699]: E0226 12:02:01.261011 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:02:03 crc kubenswrapper[4699]: I0226 12:02:03.070183 4699 generic.go:334] "Generic (PLEG): container finished" podID="27a271ab-4d30-4863-b3f6-74750cc65a91" containerID="f8af8d4fb65b858c79bdd65cde626e347dc9e20fd0df6dcb1821aae0c9ee9b41" exitCode=0 Feb 26 12:02:03 crc kubenswrapper[4699]: I0226 12:02:03.070289 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535122-xp67b" event={"ID":"27a271ab-4d30-4863-b3f6-74750cc65a91","Type":"ContainerDied","Data":"f8af8d4fb65b858c79bdd65cde626e347dc9e20fd0df6dcb1821aae0c9ee9b41"} Feb 26 12:02:04 crc kubenswrapper[4699]: I0226 12:02:04.468938 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:04 crc kubenswrapper[4699]: I0226 12:02:04.650851 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg5h4\" (UniqueName: \"kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4\") pod \"27a271ab-4d30-4863-b3f6-74750cc65a91\" (UID: \"27a271ab-4d30-4863-b3f6-74750cc65a91\") " Feb 26 12:02:04 crc kubenswrapper[4699]: I0226 12:02:04.657999 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4" (OuterVolumeSpecName: "kube-api-access-dg5h4") pod "27a271ab-4d30-4863-b3f6-74750cc65a91" (UID: "27a271ab-4d30-4863-b3f6-74750cc65a91"). InnerVolumeSpecName "kube-api-access-dg5h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:02:04 crc kubenswrapper[4699]: I0226 12:02:04.753821 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg5h4\" (UniqueName: \"kubernetes.io/projected/27a271ab-4d30-4863-b3f6-74750cc65a91-kube-api-access-dg5h4\") on node \"crc\" DevicePath \"\"" Feb 26 12:02:05 crc kubenswrapper[4699]: I0226 12:02:05.093127 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535122-xp67b" event={"ID":"27a271ab-4d30-4863-b3f6-74750cc65a91","Type":"ContainerDied","Data":"429c6a30970d5da958deab22c2e4bb6cf27c687d3ad8243aeacaea04a0d870dd"} Feb 26 12:02:05 crc kubenswrapper[4699]: I0226 12:02:05.093169 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535122-xp67b" Feb 26 12:02:05 crc kubenswrapper[4699]: I0226 12:02:05.093183 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429c6a30970d5da958deab22c2e4bb6cf27c687d3ad8243aeacaea04a0d870dd" Feb 26 12:02:05 crc kubenswrapper[4699]: I0226 12:02:05.535952 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535116-fwrcj"] Feb 26 12:02:05 crc kubenswrapper[4699]: I0226 12:02:05.544802 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535116-fwrcj"] Feb 26 12:02:06 crc kubenswrapper[4699]: I0226 12:02:06.274250 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67852f2-cfab-4e51-b986-30f2a582877d" path="/var/lib/kubelet/pods/f67852f2-cfab-4e51-b986-30f2a582877d/volumes" Feb 26 12:02:14 crc kubenswrapper[4699]: I0226 12:02:14.261423 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:02:14 crc kubenswrapper[4699]: E0226 12:02:14.263826 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:02:26 crc kubenswrapper[4699]: I0226 12:02:26.270075 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:02:26 crc kubenswrapper[4699]: E0226 12:02:26.271247 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:02:39 crc kubenswrapper[4699]: I0226 12:02:39.164004 4699 scope.go:117] "RemoveContainer" containerID="97b5ef4eef61ea4aaf36ee8c050903fab28c7dee69a56263785c220e6a8c6292" Feb 26 12:02:41 crc kubenswrapper[4699]: I0226 12:02:41.260779 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:02:41 crc kubenswrapper[4699]: E0226 12:02:41.261472 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:02:55 crc kubenswrapper[4699]: I0226 12:02:55.261556 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:02:55 crc kubenswrapper[4699]: E0226 12:02:55.262570 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:03:08 crc kubenswrapper[4699]: I0226 12:03:08.260407 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:03:08 crc kubenswrapper[4699]: 
E0226 12:03:08.261218 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:03:20 crc kubenswrapper[4699]: I0226 12:03:20.261577 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:03:20 crc kubenswrapper[4699]: E0226 12:03:20.262425 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:03:34 crc kubenswrapper[4699]: I0226 12:03:34.424912 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:03:34 crc kubenswrapper[4699]: E0226 12:03:34.425681 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.082023 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vqbk"] Feb 26 12:03:39 crc 
kubenswrapper[4699]: E0226 12:03:39.083415 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a271ab-4d30-4863-b3f6-74750cc65a91" containerName="oc" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.083440 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a271ab-4d30-4863-b3f6-74750cc65a91" containerName="oc" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.083743 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a271ab-4d30-4863-b3f6-74750cc65a91" containerName="oc" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.086019 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.092701 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqbk"] Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.207469 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvxc8\" (UniqueName: \"kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.207516 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.207599 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.309831 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.310263 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvxc8\" (UniqueName: \"kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.310343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.310462 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.310791 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.329762 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvxc8\" (UniqueName: \"kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8\") pod \"community-operators-8vqbk\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") " pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.406243 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:39 crc kubenswrapper[4699]: I0226 12:03:39.960230 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqbk"] Feb 26 12:03:40 crc kubenswrapper[4699]: I0226 12:03:40.115642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerStarted","Data":"c60a2faa98f74227c56a6cae4e1cd0d9f59ceb45b2853ca42298e281b80a3b6c"} Feb 26 12:03:41 crc kubenswrapper[4699]: I0226 12:03:41.125306 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerID="654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0" exitCode=0 Feb 26 12:03:41 crc kubenswrapper[4699]: I0226 12:03:41.125369 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerDied","Data":"654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0"} Feb 26 12:03:41 crc kubenswrapper[4699]: I0226 12:03:41.127328 4699 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 26 12:03:42 crc kubenswrapper[4699]: I0226 12:03:42.136178 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerStarted","Data":"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"} Feb 26 12:03:43 crc kubenswrapper[4699]: I0226 12:03:43.145239 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerID="fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed" exitCode=0 Feb 26 12:03:43 crc kubenswrapper[4699]: I0226 12:03:43.145358 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerDied","Data":"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"} Feb 26 12:03:44 crc kubenswrapper[4699]: I0226 12:03:44.164668 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerStarted","Data":"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"} Feb 26 12:03:44 crc kubenswrapper[4699]: I0226 12:03:44.184646 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vqbk" podStartSLOduration=2.73788143 podStartE2EDuration="5.184601837s" podCreationTimestamp="2026-02-26 12:03:39 +0000 UTC" firstStartedPulling="2026-02-26 12:03:41.127039089 +0000 UTC m=+3166.937865523" lastFinishedPulling="2026-02-26 12:03:43.573759496 +0000 UTC m=+3169.384585930" observedRunningTime="2026-02-26 12:03:44.18153607 +0000 UTC m=+3169.992362504" watchObservedRunningTime="2026-02-26 12:03:44.184601837 +0000 UTC m=+3169.995428281" Feb 26 12:03:48 crc kubenswrapper[4699]: I0226 12:03:48.261024 4699 scope.go:117] 
"RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:03:48 crc kubenswrapper[4699]: E0226 12:03:48.262005 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:03:49 crc kubenswrapper[4699]: I0226 12:03:49.406626 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:49 crc kubenswrapper[4699]: I0226 12:03:49.406942 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:49 crc kubenswrapper[4699]: I0226 12:03:49.454408 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:50 crc kubenswrapper[4699]: I0226 12:03:50.253666 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vqbk" Feb 26 12:03:50 crc kubenswrapper[4699]: I0226 12:03:50.308531 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqbk"] Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.227926 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vqbk" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="registry-server" containerID="cri-o://f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5" gracePeriod=2 Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.736308 4699 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqbk"
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.875026 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities\") pod \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") "
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.875087 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content\") pod \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") "
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.875210 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvxc8\" (UniqueName: \"kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8\") pod \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\" (UID: \"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22\") "
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.876507 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities" (OuterVolumeSpecName: "utilities") pod "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" (UID: "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.880954 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8" (OuterVolumeSpecName: "kube-api-access-wvxc8") pod "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" (UID: "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22"). InnerVolumeSpecName "kube-api-access-wvxc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.934002 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" (UID: "e2958cda-c404-4b0e-a6b4-e32bdd5b2b22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.976814 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.976842 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 12:03:52 crc kubenswrapper[4699]: I0226 12:03:52.976853 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvxc8\" (UniqueName: \"kubernetes.io/projected/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22-kube-api-access-wvxc8\") on node \"crc\" DevicePath \"\""
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.239578 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerID="f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5" exitCode=0
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.239629 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerDied","Data":"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"}
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.239657 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqbk" event={"ID":"e2958cda-c404-4b0e-a6b4-e32bdd5b2b22","Type":"ContainerDied","Data":"c60a2faa98f74227c56a6cae4e1cd0d9f59ceb45b2853ca42298e281b80a3b6c"}
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.239677 4699 scope.go:117] "RemoveContainer" containerID="f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.239836 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqbk"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.291084 4699 scope.go:117] "RemoveContainer" containerID="fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.301205 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqbk"]
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.317826 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vqbk"]
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.330748 4699 scope.go:117] "RemoveContainer" containerID="654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.364917 4699 scope.go:117] "RemoveContainer" containerID="f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"
Feb 26 12:03:53 crc kubenswrapper[4699]: E0226 12:03:53.365678 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5\": container with ID starting with f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5 not found: ID does not exist" containerID="f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.365751 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5"} err="failed to get container status \"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5\": rpc error: code = NotFound desc = could not find container \"f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5\": container with ID starting with f4243150cc96c5ee66d4d44a57548498e0161326cbcaf1706a67a62345ae76a5 not found: ID does not exist"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.365782 4699 scope.go:117] "RemoveContainer" containerID="fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"
Feb 26 12:03:53 crc kubenswrapper[4699]: E0226 12:03:53.366301 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed\": container with ID starting with fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed not found: ID does not exist" containerID="fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.366338 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed"} err="failed to get container status \"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed\": rpc error: code = NotFound desc = could not find container \"fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed\": container with ID starting with fa8520e59eea88936b0e5fd507b8acacf32d49bb81261491b76e719b13c07aed not found: ID does not exist"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.366360 4699 scope.go:117] "RemoveContainer" containerID="654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0"
Feb 26 12:03:53 crc kubenswrapper[4699]: E0226 12:03:53.366682 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0\": container with ID starting with 654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0 not found: ID does not exist" containerID="654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0"
Feb 26 12:03:53 crc kubenswrapper[4699]: I0226 12:03:53.366704 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0"} err="failed to get container status \"654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0\": rpc error: code = NotFound desc = could not find container \"654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0\": container with ID starting with 654fc2f37bae8ca8d3c43bcce6dc15128e8e0a662bbe83f02c4ba7d9f7bc3ba0 not found: ID does not exist"
Feb 26 12:03:54 crc kubenswrapper[4699]: I0226 12:03:54.282494 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" path="/var/lib/kubelet/pods/e2958cda-c404-4b0e-a6b4-e32bdd5b2b22/volumes"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.159411 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535124-gqg7z"]
Feb 26 12:04:00 crc kubenswrapper[4699]: E0226 12:04:00.160619 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="extract-content"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.160644 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="extract-content"
Feb 26 12:04:00 crc kubenswrapper[4699]: E0226 12:04:00.160665 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="registry-server"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.160676 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="registry-server"
Feb 26 12:04:00 crc kubenswrapper[4699]: E0226 12:04:00.160750 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="extract-utilities"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.160764 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="extract-utilities"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.161128 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2958cda-c404-4b0e-a6b4-e32bdd5b2b22" containerName="registry-server"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.162228 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.164633 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.164644 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.165083 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.168824 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535124-gqg7z"]
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.324350 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968p8\" (UniqueName: \"kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8\") pod \"auto-csr-approver-29535124-gqg7z\" (UID: \"af10706a-2423-4bb2-b0a5-de33b64b4b64\") " pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.426379 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968p8\" (UniqueName: \"kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8\") pod \"auto-csr-approver-29535124-gqg7z\" (UID: \"af10706a-2423-4bb2-b0a5-de33b64b4b64\") " pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.444824 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968p8\" (UniqueName: \"kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8\") pod \"auto-csr-approver-29535124-gqg7z\" (UID: \"af10706a-2423-4bb2-b0a5-de33b64b4b64\") " pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.480916 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:00 crc kubenswrapper[4699]: I0226 12:04:00.911465 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535124-gqg7z"]
Feb 26 12:04:01 crc kubenswrapper[4699]: I0226 12:04:01.261630 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:04:01 crc kubenswrapper[4699]: E0226 12:04:01.261949 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:04:01 crc kubenswrapper[4699]: I0226 12:04:01.407042 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535124-gqg7z" event={"ID":"af10706a-2423-4bb2-b0a5-de33b64b4b64","Type":"ContainerStarted","Data":"918aa5b648841b62f88b5b7c296ae0cea3220bd06e5b4ae982efbab913fb89fa"}
Feb 26 12:04:03 crc kubenswrapper[4699]: I0226 12:04:03.425066 4699 generic.go:334] "Generic (PLEG): container finished" podID="af10706a-2423-4bb2-b0a5-de33b64b4b64" containerID="bee6179034d0d615200cc2b0cca46b2b7ac3bbc955a96024e317fe4212ffc149" exitCode=0
Feb 26 12:04:03 crc kubenswrapper[4699]: I0226 12:04:03.425550 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535124-gqg7z" event={"ID":"af10706a-2423-4bb2-b0a5-de33b64b4b64","Type":"ContainerDied","Data":"bee6179034d0d615200cc2b0cca46b2b7ac3bbc955a96024e317fe4212ffc149"}
Feb 26 12:04:04 crc kubenswrapper[4699]: I0226 12:04:04.907904 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.028768 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968p8\" (UniqueName: \"kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8\") pod \"af10706a-2423-4bb2-b0a5-de33b64b4b64\" (UID: \"af10706a-2423-4bb2-b0a5-de33b64b4b64\") "
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.037340 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8" (OuterVolumeSpecName: "kube-api-access-968p8") pod "af10706a-2423-4bb2-b0a5-de33b64b4b64" (UID: "af10706a-2423-4bb2-b0a5-de33b64b4b64"). InnerVolumeSpecName "kube-api-access-968p8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.131920 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968p8\" (UniqueName: \"kubernetes.io/projected/af10706a-2423-4bb2-b0a5-de33b64b4b64-kube-api-access-968p8\") on node \"crc\" DevicePath \"\""
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.442405 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535124-gqg7z" event={"ID":"af10706a-2423-4bb2-b0a5-de33b64b4b64","Type":"ContainerDied","Data":"918aa5b648841b62f88b5b7c296ae0cea3220bd06e5b4ae982efbab913fb89fa"}
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.442446 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918aa5b648841b62f88b5b7c296ae0cea3220bd06e5b4ae982efbab913fb89fa"
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.442514 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535124-gqg7z"
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.974505 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535118-n92bn"]
Feb 26 12:04:05 crc kubenswrapper[4699]: I0226 12:04:05.983234 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535118-n92bn"]
Feb 26 12:04:06 crc kubenswrapper[4699]: I0226 12:04:06.273026 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bde379e-7dd7-4b4b-bc25-b83d0174b100" path="/var/lib/kubelet/pods/6bde379e-7dd7-4b4b-bc25-b83d0174b100/volumes"
Feb 26 12:04:15 crc kubenswrapper[4699]: I0226 12:04:15.261030 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:04:15 crc kubenswrapper[4699]: E0226 12:04:15.262018 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:04:27 crc kubenswrapper[4699]: I0226 12:04:27.262204 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:04:27 crc kubenswrapper[4699]: E0226 12:04:27.263678 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:04:38 crc kubenswrapper[4699]: I0226 12:04:38.261410 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:04:38 crc kubenswrapper[4699]: E0226 12:04:38.262282 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:04:39 crc kubenswrapper[4699]: I0226 12:04:39.263483 4699 scope.go:117] "RemoveContainer" containerID="7ee1327d152002290262452d2af09136d94e1e411a1eeb32531cce9b1d48c20c"
Feb 26 12:04:50 crc kubenswrapper[4699]: I0226 12:04:50.263621 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:04:50 crc kubenswrapper[4699]: E0226 12:04:50.264501 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:05:02 crc kubenswrapper[4699]: I0226 12:05:02.278851 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:05:02 crc kubenswrapper[4699]: E0226 12:05:02.279638 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:05:16 crc kubenswrapper[4699]: I0226 12:05:16.267316 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:05:16 crc kubenswrapper[4699]: E0226 12:05:16.268298 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:05:31 crc kubenswrapper[4699]: I0226 12:05:31.260899 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:05:31 crc kubenswrapper[4699]: E0226 12:05:31.261750 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:05:44 crc kubenswrapper[4699]: I0226 12:05:44.260629 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948"
Feb 26 12:05:45 crc kubenswrapper[4699]: I0226 12:05:45.336213 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd"}
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.207473 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535126-n7gpm"]
Feb 26 12:06:00 crc kubenswrapper[4699]: E0226 12:06:00.208400 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af10706a-2423-4bb2-b0a5-de33b64b4b64" containerName="oc"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.208416 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="af10706a-2423-4bb2-b0a5-de33b64b4b64" containerName="oc"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.208596 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="af10706a-2423-4bb2-b0a5-de33b64b4b64" containerName="oc"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.209246 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.211501 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.211501 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.211556 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.231613 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535126-n7gpm"]
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.360253 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqxp\" (UniqueName: \"kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp\") pod \"auto-csr-approver-29535126-n7gpm\" (UID: \"d62b893f-dc84-4f3a-9c62-5c49c65be99f\") " pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.462850 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqxp\" (UniqueName: \"kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp\") pod \"auto-csr-approver-29535126-n7gpm\" (UID: \"d62b893f-dc84-4f3a-9c62-5c49c65be99f\") " pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.483003 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqxp\" (UniqueName: \"kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp\") pod \"auto-csr-approver-29535126-n7gpm\" (UID: \"d62b893f-dc84-4f3a-9c62-5c49c65be99f\") " pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:00 crc kubenswrapper[4699]: I0226 12:06:00.531909 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:01 crc kubenswrapper[4699]: I0226 12:06:01.028463 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535126-n7gpm"]
Feb 26 12:06:01 crc kubenswrapper[4699]: W0226 12:06:01.032312 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62b893f_dc84_4f3a_9c62_5c49c65be99f.slice/crio-dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4 WatchSource:0}: Error finding container dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4: Status 404 returned error can't find the container with id dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4
Feb 26 12:06:01 crc kubenswrapper[4699]: I0226 12:06:01.469539 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535126-n7gpm" event={"ID":"d62b893f-dc84-4f3a-9c62-5c49c65be99f","Type":"ContainerStarted","Data":"dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4"}
Feb 26 12:06:03 crc kubenswrapper[4699]: I0226 12:06:03.488495 4699 generic.go:334] "Generic (PLEG): container finished" podID="d62b893f-dc84-4f3a-9c62-5c49c65be99f" containerID="0a9b5f9a5f2d730b937d8d7362f22b7e6fe3edad8ecb5523a71d611f339c4a8e" exitCode=0
Feb 26 12:06:03 crc kubenswrapper[4699]: I0226 12:06:03.488584 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535126-n7gpm" event={"ID":"d62b893f-dc84-4f3a-9c62-5c49c65be99f","Type":"ContainerDied","Data":"0a9b5f9a5f2d730b937d8d7362f22b7e6fe3edad8ecb5523a71d611f339c4a8e"}
Feb 26 12:06:04 crc kubenswrapper[4699]: I0226 12:06:04.904413 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.059633 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqxp\" (UniqueName: \"kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp\") pod \"d62b893f-dc84-4f3a-9c62-5c49c65be99f\" (UID: \"d62b893f-dc84-4f3a-9c62-5c49c65be99f\") "
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.065811 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp" (OuterVolumeSpecName: "kube-api-access-jxqxp") pod "d62b893f-dc84-4f3a-9c62-5c49c65be99f" (UID: "d62b893f-dc84-4f3a-9c62-5c49c65be99f"). InnerVolumeSpecName "kube-api-access-jxqxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.162261 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqxp\" (UniqueName: \"kubernetes.io/projected/d62b893f-dc84-4f3a-9c62-5c49c65be99f-kube-api-access-jxqxp\") on node \"crc\" DevicePath \"\""
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.508915 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535126-n7gpm" event={"ID":"d62b893f-dc84-4f3a-9c62-5c49c65be99f","Type":"ContainerDied","Data":"dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4"}
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.509225 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3a2105108f55c117666ba4cf91597c2e85da666a181b7ff3883c021b5428a4"
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.508973 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535126-n7gpm"
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.981433 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535120-xkftf"]
Feb 26 12:06:05 crc kubenswrapper[4699]: I0226 12:06:05.991106 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535120-xkftf"]
Feb 26 12:06:06 crc kubenswrapper[4699]: I0226 12:06:06.272590 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062171a4-9cf3-460e-822d-2dc7b5baaf9b" path="/var/lib/kubelet/pods/062171a4-9cf3-460e-822d-2dc7b5baaf9b/volumes"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.198970 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zjml6"]
Feb 26 12:06:10 crc kubenswrapper[4699]: E0226 12:06:10.200077 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62b893f-dc84-4f3a-9c62-5c49c65be99f" containerName="oc"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.200098 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62b893f-dc84-4f3a-9c62-5c49c65be99f" containerName="oc"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.200408 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62b893f-dc84-4f3a-9c62-5c49c65be99f" containerName="oc"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.201759 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.214240 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjml6"]
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.234491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.234728 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.234804 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjkp\" (UniqueName: \"kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.336039 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.336419 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.336469 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjkp\" (UniqueName: \"kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.336790 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.337081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.357689 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjkp\" (UniqueName: \"kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp\") pod \"certified-operators-zjml6\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:10 crc kubenswrapper[4699]: I0226 12:06:10.523777 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:11 crc kubenswrapper[4699]: I0226 12:06:11.036556 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjml6"]
Feb 26 12:06:11 crc kubenswrapper[4699]: I0226 12:06:11.566641 4699 generic.go:334] "Generic (PLEG): container finished" podID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerID="ba60cd8917eafa6b99bbae2da6aee01e1b66669a8cfe9e66cb7282c9c7dbc3db" exitCode=0
Feb 26 12:06:11 crc kubenswrapper[4699]: I0226 12:06:11.566698 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerDied","Data":"ba60cd8917eafa6b99bbae2da6aee01e1b66669a8cfe9e66cb7282c9c7dbc3db"}
Feb 26 12:06:11 crc kubenswrapper[4699]: I0226 12:06:11.567001 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerStarted","Data":"d39ed2334cbf7e63d9cedce8990750b2dd76b1f97bcd1a16679cb77c3660aa1e"}
Feb 26 12:06:12 crc kubenswrapper[4699]: I0226 12:06:12.604510 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerStarted","Data":"8fa98d33fc879620f19d54870e7f72b49540998de1c851b86a918b5e6e0ac2c7"}
Feb 26 12:06:13 crc kubenswrapper[4699]: I0226 12:06:13.614939 4699 generic.go:334] "Generic (PLEG): container finished" podID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerID="8fa98d33fc879620f19d54870e7f72b49540998de1c851b86a918b5e6e0ac2c7" exitCode=0
Feb 26 12:06:13 crc kubenswrapper[4699]: I0226 12:06:13.614970 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerDied","Data":"8fa98d33fc879620f19d54870e7f72b49540998de1c851b86a918b5e6e0ac2c7"}
Feb 26 12:06:17 crc kubenswrapper[4699]: I0226 12:06:17.655728 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerStarted","Data":"edbe3597e20e2d7f855f28a4fbb01d8f5d1116713fa18a3bb3297d50952c9300"}
Feb 26 12:06:17 crc kubenswrapper[4699]: I0226 12:06:17.676170 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zjml6" podStartSLOduration=2.466819862 podStartE2EDuration="7.676148113s" podCreationTimestamp="2026-02-26 12:06:10 +0000 UTC" firstStartedPulling="2026-02-26 12:06:11.56851023 +0000 UTC m=+3317.379336664" lastFinishedPulling="2026-02-26 12:06:16.777838481 +0000 UTC m=+3322.588664915" observedRunningTime="2026-02-26 12:06:17.674353202 +0000 UTC m=+3323.485179646" watchObservedRunningTime="2026-02-26 12:06:17.676148113 +0000 UTC m=+3323.486974547"
Feb 26 12:06:20 crc kubenswrapper[4699]: I0226 12:06:20.524465 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:20 crc kubenswrapper[4699]: I0226 12:06:20.525088 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:20 crc kubenswrapper[4699]: I0226 12:06:20.573056 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zjml6"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.420326 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"]
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.433985 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"]
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.434145 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28sp"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.525954 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.526028 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.526219 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsd4j\" (UniqueName: \"kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.628439 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsd4j\" (UniqueName: \"kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp"
Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.628596 4699 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.628647 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.629081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.629174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.646987 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsd4j\" (UniqueName: \"kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j\") pod \"redhat-marketplace-p28sp\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:29 crc kubenswrapper[4699]: I0226 12:06:29.756560 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:30 crc kubenswrapper[4699]: I0226 12:06:30.291986 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"] Feb 26 12:06:30 crc kubenswrapper[4699]: I0226 12:06:30.571578 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zjml6" Feb 26 12:06:30 crc kubenswrapper[4699]: I0226 12:06:30.813036 4699 generic.go:334] "Generic (PLEG): container finished" podID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerID="8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261" exitCode=0 Feb 26 12:06:30 crc kubenswrapper[4699]: I0226 12:06:30.813091 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerDied","Data":"8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261"} Feb 26 12:06:30 crc kubenswrapper[4699]: I0226 12:06:30.813137 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerStarted","Data":"909ae1a31f3b190a5c879bc6499a8426947ec5b29159b933f594b746178602d1"} Feb 26 12:06:32 crc kubenswrapper[4699]: I0226 12:06:32.836380 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerStarted","Data":"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75"} Feb 26 12:06:32 crc kubenswrapper[4699]: I0226 12:06:32.996993 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zjml6"] Feb 26 12:06:32 crc kubenswrapper[4699]: I0226 12:06:32.997632 4699 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-zjml6" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="registry-server" containerID="cri-o://edbe3597e20e2d7f855f28a4fbb01d8f5d1116713fa18a3bb3297d50952c9300" gracePeriod=2 Feb 26 12:06:33 crc kubenswrapper[4699]: I0226 12:06:33.855099 4699 generic.go:334] "Generic (PLEG): container finished" podID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerID="dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75" exitCode=0 Feb 26 12:06:33 crc kubenswrapper[4699]: I0226 12:06:33.855293 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerDied","Data":"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75"} Feb 26 12:06:33 crc kubenswrapper[4699]: I0226 12:06:33.863588 4699 generic.go:334] "Generic (PLEG): container finished" podID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerID="edbe3597e20e2d7f855f28a4fbb01d8f5d1116713fa18a3bb3297d50952c9300" exitCode=0 Feb 26 12:06:33 crc kubenswrapper[4699]: I0226 12:06:33.863639 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerDied","Data":"edbe3597e20e2d7f855f28a4fbb01d8f5d1116713fa18a3bb3297d50952c9300"} Feb 26 12:06:33 crc kubenswrapper[4699]: I0226 12:06:33.978410 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zjml6" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.114134 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content\") pod \"9d1592fb-fd51-4a05-ac77-6094fe72263b\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.114212 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities\") pod \"9d1592fb-fd51-4a05-ac77-6094fe72263b\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.114424 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjkp\" (UniqueName: \"kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp\") pod \"9d1592fb-fd51-4a05-ac77-6094fe72263b\" (UID: \"9d1592fb-fd51-4a05-ac77-6094fe72263b\") " Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.115299 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities" (OuterVolumeSpecName: "utilities") pod "9d1592fb-fd51-4a05-ac77-6094fe72263b" (UID: "9d1592fb-fd51-4a05-ac77-6094fe72263b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.120314 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp" (OuterVolumeSpecName: "kube-api-access-8zjkp") pod "9d1592fb-fd51-4a05-ac77-6094fe72263b" (UID: "9d1592fb-fd51-4a05-ac77-6094fe72263b"). InnerVolumeSpecName "kube-api-access-8zjkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.168419 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d1592fb-fd51-4a05-ac77-6094fe72263b" (UID: "9d1592fb-fd51-4a05-ac77-6094fe72263b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.216848 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.216889 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1592fb-fd51-4a05-ac77-6094fe72263b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.216899 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjkp\" (UniqueName: \"kubernetes.io/projected/9d1592fb-fd51-4a05-ac77-6094fe72263b-kube-api-access-8zjkp\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.875919 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerStarted","Data":"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb"} Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.879091 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjml6" event={"ID":"9d1592fb-fd51-4a05-ac77-6094fe72263b","Type":"ContainerDied","Data":"d39ed2334cbf7e63d9cedce8990750b2dd76b1f97bcd1a16679cb77c3660aa1e"} Feb 26 12:06:34 crc kubenswrapper[4699]: 
I0226 12:06:34.879160 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjml6" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.879163 4699 scope.go:117] "RemoveContainer" containerID="edbe3597e20e2d7f855f28a4fbb01d8f5d1116713fa18a3bb3297d50952c9300" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.898468 4699 scope.go:117] "RemoveContainer" containerID="8fa98d33fc879620f19d54870e7f72b49540998de1c851b86a918b5e6e0ac2c7" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.902583 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p28sp" podStartSLOduration=2.262914161 podStartE2EDuration="5.902563085s" podCreationTimestamp="2026-02-26 12:06:29 +0000 UTC" firstStartedPulling="2026-02-26 12:06:30.814815924 +0000 UTC m=+3336.625642358" lastFinishedPulling="2026-02-26 12:06:34.454464848 +0000 UTC m=+3340.265291282" observedRunningTime="2026-02-26 12:06:34.898627404 +0000 UTC m=+3340.709453838" watchObservedRunningTime="2026-02-26 12:06:34.902563085 +0000 UTC m=+3340.713389519" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.945865 4699 scope.go:117] "RemoveContainer" containerID="ba60cd8917eafa6b99bbae2da6aee01e1b66669a8cfe9e66cb7282c9c7dbc3db" Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.956977 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zjml6"] Feb 26 12:06:34 crc kubenswrapper[4699]: I0226 12:06:34.965634 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zjml6"] Feb 26 12:06:36 crc kubenswrapper[4699]: I0226 12:06:36.272392 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" path="/var/lib/kubelet/pods/9d1592fb-fd51-4a05-ac77-6094fe72263b/volumes" Feb 26 12:06:39 crc kubenswrapper[4699]: I0226 12:06:39.381405 4699 
scope.go:117] "RemoveContainer" containerID="133bac2de294eabd3d63693bc2552e8927f3fa0a60ee9ff7dd1f74c8eac8b98e" Feb 26 12:06:39 crc kubenswrapper[4699]: I0226 12:06:39.756773 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:39 crc kubenswrapper[4699]: I0226 12:06:39.756896 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:39 crc kubenswrapper[4699]: I0226 12:06:39.805326 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:39 crc kubenswrapper[4699]: I0226 12:06:39.988800 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:40 crc kubenswrapper[4699]: I0226 12:06:40.047533 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"] Feb 26 12:06:41 crc kubenswrapper[4699]: I0226 12:06:41.946097 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p28sp" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="registry-server" containerID="cri-o://ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb" gracePeriod=2 Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.458195 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.599665 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content\") pod \"e885b191-c6b8-4780-bc61-eaae0d82ad32\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.599809 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsd4j\" (UniqueName: \"kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j\") pod \"e885b191-c6b8-4780-bc61-eaae0d82ad32\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.600081 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities\") pod \"e885b191-c6b8-4780-bc61-eaae0d82ad32\" (UID: \"e885b191-c6b8-4780-bc61-eaae0d82ad32\") " Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.601448 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities" (OuterVolumeSpecName: "utilities") pod "e885b191-c6b8-4780-bc61-eaae0d82ad32" (UID: "e885b191-c6b8-4780-bc61-eaae0d82ad32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.607478 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j" (OuterVolumeSpecName: "kube-api-access-jsd4j") pod "e885b191-c6b8-4780-bc61-eaae0d82ad32" (UID: "e885b191-c6b8-4780-bc61-eaae0d82ad32"). InnerVolumeSpecName "kube-api-access-jsd4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.625176 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e885b191-c6b8-4780-bc61-eaae0d82ad32" (UID: "e885b191-c6b8-4780-bc61-eaae0d82ad32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.702102 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsd4j\" (UniqueName: \"kubernetes.io/projected/e885b191-c6b8-4780-bc61-eaae0d82ad32-kube-api-access-jsd4j\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.702330 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.702405 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e885b191-c6b8-4780-bc61-eaae0d82ad32-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.958703 4699 generic.go:334] "Generic (PLEG): container finished" podID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerID="ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb" exitCode=0 Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.958787 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28sp" Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.958805 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerDied","Data":"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb"} Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.960169 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28sp" event={"ID":"e885b191-c6b8-4780-bc61-eaae0d82ad32","Type":"ContainerDied","Data":"909ae1a31f3b190a5c879bc6499a8426947ec5b29159b933f594b746178602d1"} Feb 26 12:06:42 crc kubenswrapper[4699]: I0226 12:06:42.960199 4699 scope.go:117] "RemoveContainer" containerID="ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.009179 4699 scope.go:117] "RemoveContainer" containerID="dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.033270 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"] Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.039894 4699 scope.go:117] "RemoveContainer" containerID="8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.068470 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28sp"] Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.117452 4699 scope.go:117] "RemoveContainer" containerID="ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb" Feb 26 12:06:43 crc kubenswrapper[4699]: E0226 12:06:43.120621 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb\": container with ID starting with ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb not found: ID does not exist" containerID="ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.120675 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb"} err="failed to get container status \"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb\": rpc error: code = NotFound desc = could not find container \"ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb\": container with ID starting with ca3b9ef1bb6b5ba34f2b3ea120521b34766663e638086b148598e9ebcbfb2efb not found: ID does not exist" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.120705 4699 scope.go:117] "RemoveContainer" containerID="dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75" Feb 26 12:06:43 crc kubenswrapper[4699]: E0226 12:06:43.124503 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75\": container with ID starting with dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75 not found: ID does not exist" containerID="dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.124568 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75"} err="failed to get container status \"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75\": rpc error: code = NotFound desc = could not find container \"dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75\": container with ID 
starting with dd638da9b07bb5495355fbba9db461caf931aa048c85356f172f22e39da72d75 not found: ID does not exist" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.124606 4699 scope.go:117] "RemoveContainer" containerID="8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261" Feb 26 12:06:43 crc kubenswrapper[4699]: E0226 12:06:43.128389 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261\": container with ID starting with 8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261 not found: ID does not exist" containerID="8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261" Feb 26 12:06:43 crc kubenswrapper[4699]: I0226 12:06:43.128429 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261"} err="failed to get container status \"8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261\": rpc error: code = NotFound desc = could not find container \"8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261\": container with ID starting with 8160ec562f5b448ada6b4d935ee7e1221ceccd01d09aa2645648d7aa891ef261 not found: ID does not exist" Feb 26 12:06:44 crc kubenswrapper[4699]: I0226 12:06:44.272540 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" path="/var/lib/kubelet/pods/e885b191-c6b8-4780-bc61-eaae0d82ad32/volumes" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.142653 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535128-4vkb4"] Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143588 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="extract-content" Feb 26 12:08:00 crc 
kubenswrapper[4699]: I0226 12:08:00.143603 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="extract-content" Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143616 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="registry-server" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143622 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="registry-server" Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143652 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="extract-utilities" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143660 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="extract-utilities" Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143672 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="extract-content" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143678 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="extract-content" Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143689 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="extract-utilities" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143695 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="extract-utilities" Feb 26 12:08:00 crc kubenswrapper[4699]: E0226 12:08:00.143707 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="registry-server" Feb 26 12:08:00 crc 
kubenswrapper[4699]: I0226 12:08:00.143713 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="registry-server" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143886 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e885b191-c6b8-4780-bc61-eaae0d82ad32" containerName="registry-server" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.143901 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1592fb-fd51-4a05-ac77-6094fe72263b" containerName="registry-server" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.144586 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.147309 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.149054 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.149264 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.151483 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535128-4vkb4"] Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.196713 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8wjp\" (UniqueName: \"kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp\") pod \"auto-csr-approver-29535128-4vkb4\" (UID: \"0e519986-41ca-4360-b9bd-14a485e9a635\") " pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.298657 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8wjp\" (UniqueName: \"kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp\") pod \"auto-csr-approver-29535128-4vkb4\" (UID: \"0e519986-41ca-4360-b9bd-14a485e9a635\") " pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.320884 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8wjp\" (UniqueName: \"kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp\") pod \"auto-csr-approver-29535128-4vkb4\" (UID: \"0e519986-41ca-4360-b9bd-14a485e9a635\") " pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.467652 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:00 crc kubenswrapper[4699]: I0226 12:08:00.891867 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535128-4vkb4"] Feb 26 12:08:01 crc kubenswrapper[4699]: I0226 12:08:01.688689 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" event={"ID":"0e519986-41ca-4360-b9bd-14a485e9a635","Type":"ContainerStarted","Data":"a2c85cba99d29d47dea8e6e3a3d1b5671f747058fa2193a1bb1b8f06acfab4fd"} Feb 26 12:08:02 crc kubenswrapper[4699]: I0226 12:08:02.698807 4699 generic.go:334] "Generic (PLEG): container finished" podID="0e519986-41ca-4360-b9bd-14a485e9a635" containerID="edcce5d1b2431ea73d4d1a16900e65c51edf48fce3e10f865133733ba98e31ff" exitCode=0 Feb 26 12:08:02 crc kubenswrapper[4699]: I0226 12:08:02.698867 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" 
event={"ID":"0e519986-41ca-4360-b9bd-14a485e9a635","Type":"ContainerDied","Data":"edcce5d1b2431ea73d4d1a16900e65c51edf48fce3e10f865133733ba98e31ff"} Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.171785 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.273100 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8wjp\" (UniqueName: \"kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp\") pod \"0e519986-41ca-4360-b9bd-14a485e9a635\" (UID: \"0e519986-41ca-4360-b9bd-14a485e9a635\") " Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.279013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp" (OuterVolumeSpecName: "kube-api-access-s8wjp") pod "0e519986-41ca-4360-b9bd-14a485e9a635" (UID: "0e519986-41ca-4360-b9bd-14a485e9a635"). InnerVolumeSpecName "kube-api-access-s8wjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.378417 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8wjp\" (UniqueName: \"kubernetes.io/projected/0e519986-41ca-4360-b9bd-14a485e9a635-kube-api-access-s8wjp\") on node \"crc\" DevicePath \"\"" Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.719979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" event={"ID":"0e519986-41ca-4360-b9bd-14a485e9a635","Type":"ContainerDied","Data":"a2c85cba99d29d47dea8e6e3a3d1b5671f747058fa2193a1bb1b8f06acfab4fd"} Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.720238 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c85cba99d29d47dea8e6e3a3d1b5671f747058fa2193a1bb1b8f06acfab4fd" Feb 26 12:08:04 crc kubenswrapper[4699]: I0226 12:08:04.720044 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535128-4vkb4" Feb 26 12:08:05 crc kubenswrapper[4699]: I0226 12:08:05.253177 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535122-xp67b"] Feb 26 12:08:05 crc kubenswrapper[4699]: I0226 12:08:05.261251 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535122-xp67b"] Feb 26 12:08:06 crc kubenswrapper[4699]: I0226 12:08:06.272730 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a271ab-4d30-4863-b3f6-74750cc65a91" path="/var/lib/kubelet/pods/27a271ab-4d30-4863-b3f6-74750cc65a91/volumes" Feb 26 12:08:11 crc kubenswrapper[4699]: I0226 12:08:11.585174 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 12:08:11 crc kubenswrapper[4699]: I0226 12:08:11.585734 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.218834 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:30 crc kubenswrapper[4699]: E0226 12:08:30.219974 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e519986-41ca-4360-b9bd-14a485e9a635" containerName="oc" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.219990 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e519986-41ca-4360-b9bd-14a485e9a635" containerName="oc" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.220333 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e519986-41ca-4360-b9bd-14a485e9a635" containerName="oc" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.222159 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.237157 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.295010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.295081 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.295137 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hsq\" (UniqueName: \"kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.397976 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.398084 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.398146 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hsq\" (UniqueName: \"kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.399753 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.399927 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.428363 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hsq\" (UniqueName: \"kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq\") pod \"redhat-operators-j9cvm\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:30 crc kubenswrapper[4699]: I0226 12:08:30.546166 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:31 crc kubenswrapper[4699]: I0226 12:08:31.034711 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:31 crc kubenswrapper[4699]: I0226 12:08:31.962702 4699 generic.go:334] "Generic (PLEG): container finished" podID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerID="52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9" exitCode=0 Feb 26 12:08:31 crc kubenswrapper[4699]: I0226 12:08:31.962821 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerDied","Data":"52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9"} Feb 26 12:08:31 crc kubenswrapper[4699]: I0226 12:08:31.963109 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerStarted","Data":"34c5da8ce72a6b46d599d468d487b9c2112b612ace915f6720aa7df89d07fd4d"} Feb 26 12:08:35 crc kubenswrapper[4699]: I0226 12:08:35.993796 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerStarted","Data":"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417"} Feb 26 12:08:39 crc kubenswrapper[4699]: I0226 12:08:39.516026 4699 scope.go:117] "RemoveContainer" containerID="f8af8d4fb65b858c79bdd65cde626e347dc9e20fd0df6dcb1821aae0c9ee9b41" Feb 26 12:08:41 crc kubenswrapper[4699]: I0226 12:08:41.585150 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 
26 12:08:41 crc kubenswrapper[4699]: I0226 12:08:41.585496 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:08:42 crc kubenswrapper[4699]: I0226 12:08:42.052939 4699 generic.go:334] "Generic (PLEG): container finished" podID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerID="99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417" exitCode=0 Feb 26 12:08:42 crc kubenswrapper[4699]: I0226 12:08:42.052998 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerDied","Data":"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417"} Feb 26 12:08:42 crc kubenswrapper[4699]: I0226 12:08:42.056003 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 12:08:43 crc kubenswrapper[4699]: I0226 12:08:43.065425 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerStarted","Data":"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705"} Feb 26 12:08:43 crc kubenswrapper[4699]: I0226 12:08:43.090944 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j9cvm" podStartSLOduration=2.642229999 podStartE2EDuration="13.090914088s" podCreationTimestamp="2026-02-26 12:08:30 +0000 UTC" firstStartedPulling="2026-02-26 12:08:31.965020508 +0000 UTC m=+3457.775846942" lastFinishedPulling="2026-02-26 12:08:42.413704577 +0000 UTC m=+3468.224531031" observedRunningTime="2026-02-26 12:08:43.083439016 +0000 UTC m=+3468.894265460" 
watchObservedRunningTime="2026-02-26 12:08:43.090914088 +0000 UTC m=+3468.901740532" Feb 26 12:08:50 crc kubenswrapper[4699]: I0226 12:08:50.547321 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:50 crc kubenswrapper[4699]: I0226 12:08:50.547882 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:50 crc kubenswrapper[4699]: I0226 12:08:50.598130 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:51 crc kubenswrapper[4699]: I0226 12:08:51.189966 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:51 crc kubenswrapper[4699]: I0226 12:08:51.235284 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.152062 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j9cvm" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="registry-server" containerID="cri-o://b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705" gracePeriod=2 Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.614973 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.702580 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities\") pod \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.702754 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hsq\" (UniqueName: \"kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq\") pod \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.702904 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content\") pod \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\" (UID: \"60aaedcf-19db-44db-848e-bc1c1f21bf5e\") " Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.703245 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities" (OuterVolumeSpecName: "utilities") pod "60aaedcf-19db-44db-848e-bc1c1f21bf5e" (UID: "60aaedcf-19db-44db-848e-bc1c1f21bf5e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.703614 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.708549 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq" (OuterVolumeSpecName: "kube-api-access-c5hsq") pod "60aaedcf-19db-44db-848e-bc1c1f21bf5e" (UID: "60aaedcf-19db-44db-848e-bc1c1f21bf5e"). InnerVolumeSpecName "kube-api-access-c5hsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.805273 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5hsq\" (UniqueName: \"kubernetes.io/projected/60aaedcf-19db-44db-848e-bc1c1f21bf5e-kube-api-access-c5hsq\") on node \"crc\" DevicePath \"\"" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.828186 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60aaedcf-19db-44db-848e-bc1c1f21bf5e" (UID: "60aaedcf-19db-44db-848e-bc1c1f21bf5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:08:53 crc kubenswrapper[4699]: I0226 12:08:53.907053 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60aaedcf-19db-44db-848e-bc1c1f21bf5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.163633 4699 generic.go:334] "Generic (PLEG): container finished" podID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerID="b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705" exitCode=0 Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.163678 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9cvm" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.163687 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerDied","Data":"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705"} Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.163733 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9cvm" event={"ID":"60aaedcf-19db-44db-848e-bc1c1f21bf5e","Type":"ContainerDied","Data":"34c5da8ce72a6b46d599d468d487b9c2112b612ace915f6720aa7df89d07fd4d"} Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.163755 4699 scope.go:117] "RemoveContainer" containerID="b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.190017 4699 scope.go:117] "RemoveContainer" containerID="99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.197588 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 
12:08:54.207383 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j9cvm"] Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.224954 4699 scope.go:117] "RemoveContainer" containerID="52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.252320 4699 scope.go:117] "RemoveContainer" containerID="b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705" Feb 26 12:08:54 crc kubenswrapper[4699]: E0226 12:08:54.253086 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705\": container with ID starting with b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705 not found: ID does not exist" containerID="b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.253211 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705"} err="failed to get container status \"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705\": rpc error: code = NotFound desc = could not find container \"b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705\": container with ID starting with b36ccaaf0a6bf8d5b84ac0b134b909bb1978a209c041449567b49679e51dd705 not found: ID does not exist" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.253248 4699 scope.go:117] "RemoveContainer" containerID="99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417" Feb 26 12:08:54 crc kubenswrapper[4699]: E0226 12:08:54.253582 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417\": container with ID 
starting with 99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417 not found: ID does not exist" containerID="99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.253612 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417"} err="failed to get container status \"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417\": rpc error: code = NotFound desc = could not find container \"99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417\": container with ID starting with 99b42523d61f3676353772bdbec5886c84aaf0dac806fa9435ac0bae9259e417 not found: ID does not exist" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.253644 4699 scope.go:117] "RemoveContainer" containerID="52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9" Feb 26 12:08:54 crc kubenswrapper[4699]: E0226 12:08:54.253915 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9\": container with ID starting with 52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9 not found: ID does not exist" containerID="52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.253960 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9"} err="failed to get container status \"52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9\": rpc error: code = NotFound desc = could not find container \"52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9\": container with ID starting with 52cffe73c6ff4eeb0777e02dbbe4e74948138030e8e11c482a1bb38359f089c9 not found: 
ID does not exist" Feb 26 12:08:54 crc kubenswrapper[4699]: I0226 12:08:54.271480 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" path="/var/lib/kubelet/pods/60aaedcf-19db-44db-848e-bc1c1f21bf5e/volumes" Feb 26 12:09:11 crc kubenswrapper[4699]: I0226 12:09:11.585194 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:09:11 crc kubenswrapper[4699]: I0226 12:09:11.585727 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:09:11 crc kubenswrapper[4699]: I0226 12:09:11.585787 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 12:09:11 crc kubenswrapper[4699]: I0226 12:09:11.586585 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 12:09:11 crc kubenswrapper[4699]: I0226 12:09:11.586640 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" 
containerID="cri-o://243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd" gracePeriod=600 Feb 26 12:09:12 crc kubenswrapper[4699]: I0226 12:09:12.318256 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd" exitCode=0 Feb 26 12:09:12 crc kubenswrapper[4699]: I0226 12:09:12.318341 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd"} Feb 26 12:09:12 crc kubenswrapper[4699]: I0226 12:09:12.318560 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"} Feb 26 12:09:12 crc kubenswrapper[4699]: I0226 12:09:12.318580 4699 scope.go:117] "RemoveContainer" containerID="2f17994adbe3e6b82be8b555f7785740546350970bac8074db0cb6adbacaf948" Feb 26 12:09:36 crc kubenswrapper[4699]: I0226 12:09:36.569690 4699 generic.go:334] "Generic (PLEG): container finished" podID="19e02200-91be-49f8-8174-4a0bf6cda9dd" containerID="084f210d6c46d1c100bf0bcfdc7ffd17238944ee1beffdf271d0e8035c249561" exitCode=0 Feb 26 12:09:36 crc kubenswrapper[4699]: I0226 12:09:36.569773 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19e02200-91be-49f8-8174-4a0bf6cda9dd","Type":"ContainerDied","Data":"084f210d6c46d1c100bf0bcfdc7ffd17238944ee1beffdf271d0e8035c249561"} Feb 26 12:09:37 crc kubenswrapper[4699]: I0226 12:09:37.990703 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040541 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw89z\" (UniqueName: \"kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040625 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040677 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040702 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040776 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040836 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040871 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040896 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.040937 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key\") pod \"19e02200-91be-49f8-8174-4a0bf6cda9dd\" (UID: \"19e02200-91be-49f8-8174-4a0bf6cda9dd\") " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.041366 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.041632 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data" (OuterVolumeSpecName: "config-data") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.041860 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.041879 4699 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.045737 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.046092 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.046424 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z" (OuterVolumeSpecName: "kube-api-access-qw89z") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "kube-api-access-qw89z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.074795 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.076229 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.088640 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.100695 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "19e02200-91be-49f8-8174-4a0bf6cda9dd" (UID: "19e02200-91be-49f8-8174-4a0bf6cda9dd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144304 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw89z\" (UniqueName: \"kubernetes.io/projected/19e02200-91be-49f8-8174-4a0bf6cda9dd-kube-api-access-qw89z\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144386 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144399 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144409 4699 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19e02200-91be-49f8-8174-4a0bf6cda9dd-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144420 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e02200-91be-49f8-8174-4a0bf6cda9dd-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144428 4699 reconciler_common.go:293] "Volume 
detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.144437 4699 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19e02200-91be-49f8-8174-4a0bf6cda9dd-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.172077 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.246512 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.588409 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19e02200-91be-49f8-8174-4a0bf6cda9dd","Type":"ContainerDied","Data":"0f178f25ec5476c2b73a67092a0049cc1be8c1984e676d6f03c82e6dac970a0f"} Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.588462 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f178f25ec5476c2b73a67092a0049cc1be8c1984e676d6f03c82e6dac970a0f" Feb 26 12:09:38 crc kubenswrapper[4699]: I0226 12:09:38.588513 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.587136 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 12:09:41 crc kubenswrapper[4699]: E0226 12:09:41.587932 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="registry-server" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.587946 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="registry-server" Feb 26 12:09:41 crc kubenswrapper[4699]: E0226 12:09:41.587967 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="extract-content" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.587973 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="extract-content" Feb 26 12:09:41 crc kubenswrapper[4699]: E0226 12:09:41.588000 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e02200-91be-49f8-8174-4a0bf6cda9dd" containerName="tempest-tests-tempest-tests-runner" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.588007 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e02200-91be-49f8-8174-4a0bf6cda9dd" containerName="tempest-tests-tempest-tests-runner" Feb 26 12:09:41 crc kubenswrapper[4699]: E0226 12:09:41.588021 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="extract-utilities" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.588027 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="extract-utilities" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.588233 4699 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="19e02200-91be-49f8-8174-4a0bf6cda9dd" containerName="tempest-tests-tempest-tests-runner" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.588247 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="60aaedcf-19db-44db-848e-bc1c1f21bf5e" containerName="registry-server" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.588974 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.592352 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fmwlb" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.598663 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.614348 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.614688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6jw\" (UniqueName: \"kubernetes.io/projected/66beadbe-fd5d-48af-8a33-8a652c8d1c71-kube-api-access-cg6jw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.716816 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6jw\" (UniqueName: 
\"kubernetes.io/projected/66beadbe-fd5d-48af-8a33-8a652c8d1c71-kube-api-access-cg6jw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.717011 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.717775 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.746839 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6jw\" (UniqueName: \"kubernetes.io/projected/66beadbe-fd5d-48af-8a33-8a652c8d1c71-kube-api-access-cg6jw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 crc kubenswrapper[4699]: I0226 12:09:41.757396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66beadbe-fd5d-48af-8a33-8a652c8d1c71\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:41 
crc kubenswrapper[4699]: I0226 12:09:41.922923 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 12:09:42 crc kubenswrapper[4699]: I0226 12:09:42.415560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 12:09:42 crc kubenswrapper[4699]: I0226 12:09:42.627770 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"66beadbe-fd5d-48af-8a33-8a652c8d1c71","Type":"ContainerStarted","Data":"847da687299bc9288ded47021eee31d66d769ac8d475d3f5fb40ed971a8106f0"} Feb 26 12:09:43 crc kubenswrapper[4699]: I0226 12:09:43.638939 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"66beadbe-fd5d-48af-8a33-8a652c8d1c71","Type":"ContainerStarted","Data":"53d8b61522ead810a906b8e7f70dd6e20b8faf411b8a954290488492e292ef7b"} Feb 26 12:09:43 crc kubenswrapper[4699]: I0226 12:09:43.655503 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.765777923 podStartE2EDuration="2.655477524s" podCreationTimestamp="2026-02-26 12:09:41 +0000 UTC" firstStartedPulling="2026-02-26 12:09:42.417886399 +0000 UTC m=+3528.228712833" lastFinishedPulling="2026-02-26 12:09:43.30758596 +0000 UTC m=+3529.118412434" observedRunningTime="2026-02-26 12:09:43.650642577 +0000 UTC m=+3529.461469021" watchObservedRunningTime="2026-02-26 12:09:43.655477524 +0000 UTC m=+3529.466303958" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.163873 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535130-v52gt"] Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.167063 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.172072 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.172552 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.173062 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.178062 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535130-v52gt"] Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.297622 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnt8x\" (UniqueName: \"kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x\") pod \"auto-csr-approver-29535130-v52gt\" (UID: \"524c38c5-5560-45a6-aa15-3010000b2165\") " pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.414698 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnt8x\" (UniqueName: \"kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x\") pod \"auto-csr-approver-29535130-v52gt\" (UID: \"524c38c5-5560-45a6-aa15-3010000b2165\") " pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.436766 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnt8x\" (UniqueName: \"kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x\") pod \"auto-csr-approver-29535130-v52gt\" (UID: \"524c38c5-5560-45a6-aa15-3010000b2165\") " 
pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.494375 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:00 crc kubenswrapper[4699]: I0226 12:10:00.931692 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535130-v52gt"] Feb 26 12:10:01 crc kubenswrapper[4699]: I0226 12:10:01.796975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535130-v52gt" event={"ID":"524c38c5-5560-45a6-aa15-3010000b2165","Type":"ContainerStarted","Data":"95b121cd73f6696c5e9b893eea048f56496a29dd23330d6b28609095b8186a5a"} Feb 26 12:10:02 crc kubenswrapper[4699]: E0226 12:10:02.640443 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod524c38c5_5560_45a6_aa15_3010000b2165.slice/crio-b1e1f8248ccd17084f1b3aa21ad1265018f7368ffdb4ddbf286721c65474aad5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod524c38c5_5560_45a6_aa15_3010000b2165.slice/crio-conmon-b1e1f8248ccd17084f1b3aa21ad1265018f7368ffdb4ddbf286721c65474aad5.scope\": RecentStats: unable to find data in memory cache]" Feb 26 12:10:02 crc kubenswrapper[4699]: I0226 12:10:02.807277 4699 generic.go:334] "Generic (PLEG): container finished" podID="524c38c5-5560-45a6-aa15-3010000b2165" containerID="b1e1f8248ccd17084f1b3aa21ad1265018f7368ffdb4ddbf286721c65474aad5" exitCode=0 Feb 26 12:10:02 crc kubenswrapper[4699]: I0226 12:10:02.807324 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535130-v52gt" event={"ID":"524c38c5-5560-45a6-aa15-3010000b2165","Type":"ContainerDied","Data":"b1e1f8248ccd17084f1b3aa21ad1265018f7368ffdb4ddbf286721c65474aad5"} Feb 26 
12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.175915 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.192585 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnt8x\" (UniqueName: \"kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x\") pod \"524c38c5-5560-45a6-aa15-3010000b2165\" (UID: \"524c38c5-5560-45a6-aa15-3010000b2165\") " Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.199775 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x" (OuterVolumeSpecName: "kube-api-access-lnt8x") pod "524c38c5-5560-45a6-aa15-3010000b2165" (UID: "524c38c5-5560-45a6-aa15-3010000b2165"). InnerVolumeSpecName "kube-api-access-lnt8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.294447 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnt8x\" (UniqueName: \"kubernetes.io/projected/524c38c5-5560-45a6-aa15-3010000b2165-kube-api-access-lnt8x\") on node \"crc\" DevicePath \"\"" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.830387 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535130-v52gt" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.830376 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535130-v52gt" event={"ID":"524c38c5-5560-45a6-aa15-3010000b2165","Type":"ContainerDied","Data":"95b121cd73f6696c5e9b893eea048f56496a29dd23330d6b28609095b8186a5a"} Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.830601 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95b121cd73f6696c5e9b893eea048f56496a29dd23330d6b28609095b8186a5a" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.911189 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4plh5/must-gather-cwqbr"] Feb 26 12:10:04 crc kubenswrapper[4699]: E0226 12:10:04.911723 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524c38c5-5560-45a6-aa15-3010000b2165" containerName="oc" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.911746 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="524c38c5-5560-45a6-aa15-3010000b2165" containerName="oc" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.912002 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="524c38c5-5560-45a6-aa15-3010000b2165" containerName="oc" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.913375 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.915652 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4plh5"/"default-dockercfg-lbpps" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.916615 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4plh5"/"openshift-service-ca.crt" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.918606 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4plh5"/"kube-root-ca.crt" Feb 26 12:10:04 crc kubenswrapper[4699]: I0226 12:10:04.936043 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4plh5/must-gather-cwqbr"] Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.107632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p96w\" (UniqueName: \"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.107699 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.210142 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p96w\" (UniqueName: \"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " 
pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.210217 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.210765 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.229460 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p96w\" (UniqueName: \"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w\") pod \"must-gather-cwqbr\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.233585 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.256803 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535124-gqg7z"] Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.265063 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535124-gqg7z"] Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.666041 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4plh5/must-gather-cwqbr"] Feb 26 12:10:05 crc kubenswrapper[4699]: W0226 12:10:05.667632 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e6e42cb_6891_4f97_9ba8_b4c6ad63a7a9.slice/crio-0ef1f3bd591d4afff35f968e33ba667440df3e1ced5cc806faf7879a065504f3 WatchSource:0}: Error finding container 0ef1f3bd591d4afff35f968e33ba667440df3e1ced5cc806faf7879a065504f3: Status 404 returned error can't find the container with id 0ef1f3bd591d4afff35f968e33ba667440df3e1ced5cc806faf7879a065504f3 Feb 26 12:10:05 crc kubenswrapper[4699]: I0226 12:10:05.842778 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/must-gather-cwqbr" event={"ID":"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9","Type":"ContainerStarted","Data":"0ef1f3bd591d4afff35f968e33ba667440df3e1ced5cc806faf7879a065504f3"} Feb 26 12:10:06 crc kubenswrapper[4699]: I0226 12:10:06.280900 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af10706a-2423-4bb2-b0a5-de33b64b4b64" path="/var/lib/kubelet/pods/af10706a-2423-4bb2-b0a5-de33b64b4b64/volumes" Feb 26 12:10:12 crc kubenswrapper[4699]: I0226 12:10:12.905264 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/must-gather-cwqbr" 
event={"ID":"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9","Type":"ContainerStarted","Data":"40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd"} Feb 26 12:10:12 crc kubenswrapper[4699]: I0226 12:10:12.905829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/must-gather-cwqbr" event={"ID":"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9","Type":"ContainerStarted","Data":"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb"} Feb 26 12:10:12 crc kubenswrapper[4699]: I0226 12:10:12.933041 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4plh5/must-gather-cwqbr" podStartSLOduration=2.527169629 podStartE2EDuration="8.932993295s" podCreationTimestamp="2026-02-26 12:10:04 +0000 UTC" firstStartedPulling="2026-02-26 12:10:05.670956458 +0000 UTC m=+3551.481782892" lastFinishedPulling="2026-02-26 12:10:12.076780124 +0000 UTC m=+3557.887606558" observedRunningTime="2026-02-26 12:10:12.926170732 +0000 UTC m=+3558.736997186" watchObservedRunningTime="2026-02-26 12:10:12.932993295 +0000 UTC m=+3558.743819729" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.783317 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4plh5/crc-debug-bswb4"] Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.785765 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.834611 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcq4\" (UniqueName: \"kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.834716 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.936624 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcq4\" (UniqueName: \"kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.936725 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc kubenswrapper[4699]: I0226 12:10:15.936911 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4" Feb 26 12:10:15 crc 
kubenswrapper[4699]: I0226 12:10:15.956807 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcq4\" (UniqueName: \"kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4\") pod \"crc-debug-bswb4\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") " pod="openshift-must-gather-4plh5/crc-debug-bswb4"
Feb 26 12:10:16 crc kubenswrapper[4699]: I0226 12:10:16.112850 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-bswb4"
Feb 26 12:10:16 crc kubenswrapper[4699]: I0226 12:10:16.943244 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-bswb4" event={"ID":"864b723d-bc08-43f3-a5ec-718a0066eac0","Type":"ContainerStarted","Data":"adeffda648e6a329b7a914fc4d25da8e13877082c895a3e7fe7e9714af65d8ec"}
Feb 26 12:10:28 crc kubenswrapper[4699]: I0226 12:10:28.055865 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-bswb4" event={"ID":"864b723d-bc08-43f3-a5ec-718a0066eac0","Type":"ContainerStarted","Data":"548a0e8c1b14580465351f41c66bafc1b217669a68f00a69bd71038d87540f9f"}
Feb 26 12:10:28 crc kubenswrapper[4699]: I0226 12:10:28.081057 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4plh5/crc-debug-bswb4" podStartSLOduration=1.687489682 podStartE2EDuration="13.081018608s" podCreationTimestamp="2026-02-26 12:10:15 +0000 UTC" firstStartedPulling="2026-02-26 12:10:16.196870879 +0000 UTC m=+3562.007697313" lastFinishedPulling="2026-02-26 12:10:27.590399805 +0000 UTC m=+3573.401226239" observedRunningTime="2026-02-26 12:10:28.07051953 +0000 UTC m=+3573.881345974" watchObservedRunningTime="2026-02-26 12:10:28.081018608 +0000 UTC m=+3573.891845062"
Feb 26 12:10:39 crc kubenswrapper[4699]: I0226 12:10:39.616159 4699 scope.go:117] "RemoveContainer" containerID="bee6179034d0d615200cc2b0cca46b2b7ac3bbc955a96024e317fe4212ffc149"
Feb 26 12:11:09 crc kubenswrapper[4699]: I0226 12:11:09.428849 4699 generic.go:334] "Generic (PLEG): container finished" podID="864b723d-bc08-43f3-a5ec-718a0066eac0" containerID="548a0e8c1b14580465351f41c66bafc1b217669a68f00a69bd71038d87540f9f" exitCode=0
Feb 26 12:11:09 crc kubenswrapper[4699]: I0226 12:11:09.428897 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-bswb4" event={"ID":"864b723d-bc08-43f3-a5ec-718a0066eac0","Type":"ContainerDied","Data":"548a0e8c1b14580465351f41c66bafc1b217669a68f00a69bd71038d87540f9f"}
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.530570 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-bswb4"
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.565782 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-bswb4"]
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.575058 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-bswb4"]
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.609230 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpcq4\" (UniqueName: \"kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4\") pod \"864b723d-bc08-43f3-a5ec-718a0066eac0\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") "
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.609348 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host\") pod \"864b723d-bc08-43f3-a5ec-718a0066eac0\" (UID: \"864b723d-bc08-43f3-a5ec-718a0066eac0\") "
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.609680 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host" (OuterVolumeSpecName: "host") pod "864b723d-bc08-43f3-a5ec-718a0066eac0" (UID: "864b723d-bc08-43f3-a5ec-718a0066eac0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.610091 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/864b723d-bc08-43f3-a5ec-718a0066eac0-host\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.616347 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4" (OuterVolumeSpecName: "kube-api-access-fpcq4") pod "864b723d-bc08-43f3-a5ec-718a0066eac0" (UID: "864b723d-bc08-43f3-a5ec-718a0066eac0"). InnerVolumeSpecName "kube-api-access-fpcq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:11:10 crc kubenswrapper[4699]: I0226 12:11:10.711768 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpcq4\" (UniqueName: \"kubernetes.io/projected/864b723d-bc08-43f3-a5ec-718a0066eac0-kube-api-access-fpcq4\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.448441 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adeffda648e6a329b7a914fc4d25da8e13877082c895a3e7fe7e9714af65d8ec"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.448481 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-bswb4"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.584959 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.585045 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.853013 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4plh5/crc-debug-9d4cm"]
Feb 26 12:11:11 crc kubenswrapper[4699]: E0226 12:11:11.853485 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864b723d-bc08-43f3-a5ec-718a0066eac0" containerName="container-00"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.853510 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="864b723d-bc08-43f3-a5ec-718a0066eac0" containerName="container-00"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.853777 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="864b723d-bc08-43f3-a5ec-718a0066eac0" containerName="container-00"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.854591 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.936144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6zg\" (UniqueName: \"kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:11 crc kubenswrapper[4699]: I0226 12:11:11.936242 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.038840 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6zg\" (UniqueName: \"kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.038933 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.039310 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.061983 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6zg\" (UniqueName: \"kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg\") pod \"crc-debug-9d4cm\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") " pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.170591 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.277756 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864b723d-bc08-43f3-a5ec-718a0066eac0" path="/var/lib/kubelet/pods/864b723d-bc08-43f3-a5ec-718a0066eac0/volumes"
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.455937 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-9d4cm" event={"ID":"0e0a84dc-a0ad-4da3-b004-98594b781410","Type":"ContainerStarted","Data":"1830ebb83943317d1452f94ddd1bbd24c88d43dcb4e4541e0c9c10d16e425c29"}
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.455983 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-9d4cm" event={"ID":"0e0a84dc-a0ad-4da3-b004-98594b781410","Type":"ContainerStarted","Data":"2d733d9b3e91b35010efb39dfda838f4c195644dab97fb7f5913b7b91c78c259"}
Feb 26 12:11:12 crc kubenswrapper[4699]: I0226 12:11:12.478714 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4plh5/crc-debug-9d4cm" podStartSLOduration=1.478692661 podStartE2EDuration="1.478692661s" podCreationTimestamp="2026-02-26 12:11:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 12:11:12.470850299 +0000 UTC m=+3618.281676753" watchObservedRunningTime="2026-02-26 12:11:12.478692661 +0000 UTC m=+3618.289519095"
Feb 26 12:11:13 crc kubenswrapper[4699]: I0226 12:11:13.467504 4699 generic.go:334] "Generic (PLEG): container finished" podID="0e0a84dc-a0ad-4da3-b004-98594b781410" containerID="1830ebb83943317d1452f94ddd1bbd24c88d43dcb4e4541e0c9c10d16e425c29" exitCode=0
Feb 26 12:11:13 crc kubenswrapper[4699]: I0226 12:11:13.467605 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-9d4cm" event={"ID":"0e0a84dc-a0ad-4da3-b004-98594b781410","Type":"ContainerDied","Data":"1830ebb83943317d1452f94ddd1bbd24c88d43dcb4e4541e0c9c10d16e425c29"}
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.572231 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.627018 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-9d4cm"]
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.638270 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-9d4cm"]
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.687344 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host\") pod \"0e0a84dc-a0ad-4da3-b004-98594b781410\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") "
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.687607 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg6zg\" (UniqueName: \"kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg\") pod \"0e0a84dc-a0ad-4da3-b004-98594b781410\" (UID: \"0e0a84dc-a0ad-4da3-b004-98594b781410\") "
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.687830 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host" (OuterVolumeSpecName: "host") pod "0e0a84dc-a0ad-4da3-b004-98594b781410" (UID: "0e0a84dc-a0ad-4da3-b004-98594b781410"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.688081 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e0a84dc-a0ad-4da3-b004-98594b781410-host\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.694854 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg" (OuterVolumeSpecName: "kube-api-access-wg6zg") pod "0e0a84dc-a0ad-4da3-b004-98594b781410" (UID: "0e0a84dc-a0ad-4da3-b004-98594b781410"). InnerVolumeSpecName "kube-api-access-wg6zg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:11:14 crc kubenswrapper[4699]: I0226 12:11:14.790030 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg6zg\" (UniqueName: \"kubernetes.io/projected/0e0a84dc-a0ad-4da3-b004-98594b781410-kube-api-access-wg6zg\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.489663 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d733d9b3e91b35010efb39dfda838f4c195644dab97fb7f5913b7b91c78c259"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.489820 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-9d4cm"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.813076 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4plh5/crc-debug-tbmzf"]
Feb 26 12:11:15 crc kubenswrapper[4699]: E0226 12:11:15.813797 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0a84dc-a0ad-4da3-b004-98594b781410" containerName="container-00"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.813810 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0a84dc-a0ad-4da3-b004-98594b781410" containerName="container-00"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.814038 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0a84dc-a0ad-4da3-b004-98594b781410" containerName="container-00"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.814719 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.912623 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbls\" (UniqueName: \"kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:15 crc kubenswrapper[4699]: I0226 12:11:15.912671 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.014341 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbls\" (UniqueName: \"kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.014415 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.014551 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.038285 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbls\" (UniqueName: \"kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls\") pod \"crc-debug-tbmzf\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") " pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.138397 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.273199 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0a84dc-a0ad-4da3-b004-98594b781410" path="/var/lib/kubelet/pods/0e0a84dc-a0ad-4da3-b004-98594b781410/volumes"
Feb 26 12:11:16 crc kubenswrapper[4699]: I0226 12:11:16.500442 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-tbmzf" event={"ID":"e9944e38-a08c-4638-b132-1841c82d51c2","Type":"ContainerStarted","Data":"ad8cfdb739766e618ecc3a2f4fa20f12f46798f6783950e3cb56a1c6c7d53124"}
Feb 26 12:11:17 crc kubenswrapper[4699]: I0226 12:11:17.510982 4699 generic.go:334] "Generic (PLEG): container finished" podID="e9944e38-a08c-4638-b132-1841c82d51c2" containerID="80c52c8f64990f9768d3a0c9c4bf05f19b3350e70a27d2f9ab510f7d7259fb47" exitCode=0
Feb 26 12:11:17 crc kubenswrapper[4699]: I0226 12:11:17.511038 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/crc-debug-tbmzf" event={"ID":"e9944e38-a08c-4638-b132-1841c82d51c2","Type":"ContainerDied","Data":"80c52c8f64990f9768d3a0c9c4bf05f19b3350e70a27d2f9ab510f7d7259fb47"}
Feb 26 12:11:17 crc kubenswrapper[4699]: I0226 12:11:17.557532 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-tbmzf"]
Feb 26 12:11:17 crc kubenswrapper[4699]: I0226 12:11:17.570544 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4plh5/crc-debug-tbmzf"]
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.629860 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.766728 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbls\" (UniqueName: \"kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls\") pod \"e9944e38-a08c-4638-b132-1841c82d51c2\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") "
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.766858 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host\") pod \"e9944e38-a08c-4638-b132-1841c82d51c2\" (UID: \"e9944e38-a08c-4638-b132-1841c82d51c2\") "
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.766938 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host" (OuterVolumeSpecName: "host") pod "e9944e38-a08c-4638-b132-1841c82d51c2" (UID: "e9944e38-a08c-4638-b132-1841c82d51c2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.767337 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9944e38-a08c-4638-b132-1841c82d51c2-host\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.772590 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls" (OuterVolumeSpecName: "kube-api-access-jtbls") pod "e9944e38-a08c-4638-b132-1841c82d51c2" (UID: "e9944e38-a08c-4638-b132-1841c82d51c2"). InnerVolumeSpecName "kube-api-access-jtbls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:11:18 crc kubenswrapper[4699]: I0226 12:11:18.868810 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbls\" (UniqueName: \"kubernetes.io/projected/e9944e38-a08c-4638-b132-1841c82d51c2-kube-api-access-jtbls\") on node \"crc\" DevicePath \"\""
Feb 26 12:11:19 crc kubenswrapper[4699]: I0226 12:11:19.529813 4699 scope.go:117] "RemoveContainer" containerID="80c52c8f64990f9768d3a0c9c4bf05f19b3350e70a27d2f9ab510f7d7259fb47"
Feb 26 12:11:19 crc kubenswrapper[4699]: I0226 12:11:19.529861 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/crc-debug-tbmzf"
Feb 26 12:11:20 crc kubenswrapper[4699]: I0226 12:11:20.272393 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9944e38-a08c-4638-b132-1841c82d51c2" path="/var/lib/kubelet/pods/e9944e38-a08c-4638-b132-1841c82d51c2/volumes"
Feb 26 12:11:33 crc kubenswrapper[4699]: I0226 12:11:33.633093 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-977f89944-b96zk_dd004e01-9dac-4316-b6ee-05c1a0f20713/barbican-api/0.log"
Feb 26 12:11:33 crc kubenswrapper[4699]: I0226 12:11:33.812784 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-977f89944-b96zk_dd004e01-9dac-4316-b6ee-05c1a0f20713/barbican-api-log/0.log"
Feb 26 12:11:33 crc kubenswrapper[4699]: I0226 12:11:33.830243 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bb8c656f4-cl8tt_770f4ffe-352c-416b-8f67-a894c4107003/barbican-keystone-listener/0.log"
Feb 26 12:11:33 crc kubenswrapper[4699]: I0226 12:11:33.908642 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bb8c656f4-cl8tt_770f4ffe-352c-416b-8f67-a894c4107003/barbican-keystone-listener-log/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.019152 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6596b66679-qmv4f_edb59470-4038-48c2-a3ec-f3046406a971/barbican-worker/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.054823 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6596b66679-qmv4f_edb59470-4038-48c2-a3ec-f3046406a971/barbican-worker-log/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.233294 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj_fee4a36b-0896-43c1-9b23-3da3ae870cbe/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.259557 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/ceilometer-central-agent/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.334734 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/ceilometer-notification-agent/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.404978 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/proxy-httpd/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.470727 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/sg-core/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.588243 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2c2d2c1-e68e-4b14-a732-3b42a6132503/cinder-api/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.614106 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2c2d2c1-e68e-4b14-a732-3b42a6132503/cinder-api-log/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.745155 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fbf1f488-444f-45d3-b5e6-44506bf45f8e/cinder-scheduler/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.885593 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fbf1f488-444f-45d3-b5e6-44506bf45f8e/probe/0.log"
Feb 26 12:11:34 crc kubenswrapper[4699]: I0226 12:11:34.974374 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-86gl7_b1a06be0-15ce-4abd-b9e7-7e11e789bd64/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:35 crc kubenswrapper[4699]: I0226 12:11:35.085341 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-h9q25_85e0d37e-fb25-4bbc-afe5-7e6ab304390c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:35 crc kubenswrapper[4699]: I0226 12:11:35.244894 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/init/0.log"
Feb 26 12:11:35 crc kubenswrapper[4699]: I0226 12:11:35.533541 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/dnsmasq-dns/0.log"
Feb 26 12:11:35 crc kubenswrapper[4699]: I0226 12:11:35.557598 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/init/0.log"
Feb 26 12:11:35 crc kubenswrapper[4699]: I0226 12:11:35.635227 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-f97wz_8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.005725 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c58ea0a-4ad4-47cf-8976-a004ef7e56da/glance-httpd/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.020901 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c58ea0a-4ad4-47cf-8976-a004ef7e56da/glance-log/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.173371 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_796738f1-8a6c-4e91-bdfe-bee2f252b3fc/glance-httpd/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.206904 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_796738f1-8a6c-4e91-bdfe-bee2f252b3fc/glance-log/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.409461 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5795557cd8-dvzqq_15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0/horizon/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.615846 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv_e537c30c-dc6b-406f-bb86-5540ebd8a36d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.627649 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5795557cd8-dvzqq_15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0/horizon-log/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.725913 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mlb2f_ac66647f-74c0-4a4e-9925-e47cd90568a1/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.947417 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535121-plvtd_ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68/keystone-cron/0.log"
Feb 26 12:11:36 crc kubenswrapper[4699]: I0226 12:11:36.996863 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67d4f89fb9-65kmq_5d9e1983-3363-4542-a5f0-deb132ea6994/keystone-api/0.log"
Feb 26 12:11:37 crc kubenswrapper[4699]: I0226 12:11:37.144509 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c685fadd-b283-40bc-9de2-3372317b9875/kube-state-metrics/0.log"
Feb 26 12:11:37 crc kubenswrapper[4699]: I0226 12:11:37.227917 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f_6436c321-6850-4db3-81b2-0dc329e10900/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:37 crc kubenswrapper[4699]: I0226 12:11:37.619104 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d45896d49-mh862_862cb546-78f8-4864-a158-9dc217ec2796/neutron-httpd/0.log"
Feb 26 12:11:37 crc kubenswrapper[4699]: I0226 12:11:37.637991 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d45896d49-mh862_862cb546-78f8-4864-a158-9dc217ec2796/neutron-api/0.log"
Feb 26 12:11:38 crc kubenswrapper[4699]: I0226 12:11:38.259283 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l_59456382-a459-4f82-ac99-b96eb735ddb9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:38 crc kubenswrapper[4699]: I0226 12:11:38.671284 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2ff15a2d-962f-421b-be00-e3bf6ef22612/nova-cell0-conductor-conductor/0.log"
Feb 26 12:11:38 crc kubenswrapper[4699]: I0226 12:11:38.693258 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2d0d807f-7fdc-4239-b7bb-1952c2f7c222/nova-api-log/0.log"
Feb 26 12:11:38 crc kubenswrapper[4699]: I0226 12:11:38.876584 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2d0d807f-7fdc-4239-b7bb-1952c2f7c222/nova-api-api/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.058546 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ff2b3846-c197-4cc6-a442-0f466d97d53d/nova-cell1-conductor-conductor/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.094617 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8bb28763-ceae-456c-a0d6-5df33b478106/nova-cell1-novncproxy-novncproxy/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.377813 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wv666_2c2e8329-038c-4347-b30f-f8b42f36cc67/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.530295 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15752dfa-4afb-412f-99a0-75c5fe76f6a8/nova-metadata-log/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.863705 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/mysql-bootstrap/0.log"
Feb 26 12:11:39 crc kubenswrapper[4699]: I0226 12:11:39.899447 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9d8371db-373f-4a41-97cb-b2d00aa17571/nova-scheduler-scheduler/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.080756 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/mysql-bootstrap/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.126223 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/galera/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.324101 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/mysql-bootstrap/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.506439 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/mysql-bootstrap/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.513947 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/galera/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.678676 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_16db7cc3-bd7c-44aa-b92f-d2a645d96ef0/openstackclient/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.696744 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15752dfa-4afb-412f-99a0-75c5fe76f6a8/nova-metadata-metadata/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.798972 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qfxsz_a4767003-9eba-4b86-933c-5bcbaa93e458/openstack-network-exporter/0.log"
Feb 26 12:11:40 crc kubenswrapper[4699]: I0226 12:11:40.983440 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server-init/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.001256 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nrvng_cd4015f0-f1a7-40d7-ae69-089f74a6873d/ovn-controller/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.215422 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovs-vswitchd/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.222768 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server-init/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.274107 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.431886 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hmpqg_dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.531379 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbd47d6-02c1-4ac4-a981-231eb0f13530/ovn-northd/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.549601 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbd47d6-02c1-4ac4-a981-231eb0f13530/openstack-network-exporter/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.584618 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.584742 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.714465 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ef805480-81ec-4d0b-b2ca-06db4bf74383/openstack-network-exporter/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.733908 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ef805480-81ec-4d0b-b2ca-06db4bf74383/ovsdbserver-nb/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.860028 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b981c8a5-ce76-4bc1-a018-28255391e3f2/openstack-network-exporter/0.log"
Feb 26 12:11:41 crc kubenswrapper[4699]: I0226 12:11:41.946074 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b981c8a5-ce76-4bc1-a018-28255391e3f2/ovsdbserver-sb/0.log"
Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.071762 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d4878dd78-qpvzg_b7700bd0-21d8-4b96-9753-2619443038a3/placement-api/0.log"
Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.158883 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d4878dd78-qpvzg_b7700bd0-21d8-4b96-9753-2619443038a3/placement-log/0.log"
Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.261612 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/setup-container/0.log"
Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.491345 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/setup-container/0.log"
Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.493798 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/setup-container/0.log"
Feb 26 12:11:42
crc kubenswrapper[4699]: I0226 12:11:42.568771 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/rabbitmq/0.log" Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.716212 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/setup-container/0.log" Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.774181 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l_a1aabb80-3c23-4f5a-9bd1-4d573089856c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:42 crc kubenswrapper[4699]: I0226 12:11:42.868508 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/rabbitmq/0.log" Feb 26 12:11:43 crc kubenswrapper[4699]: I0226 12:11:43.301760 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zdf2z_fcea0fcf-0c80-4334-9327-f0a57b385cc9/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:43 crc kubenswrapper[4699]: I0226 12:11:43.312855 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n_57bbec48-f33e-43b8-9f82-8cc3a42e7723/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:43 crc kubenswrapper[4699]: I0226 12:11:43.599724 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8w2tv_96b6beba-4e99-4cb7-b49b-3f211c5e12b7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:43 crc kubenswrapper[4699]: I0226 12:11:43.670541 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t4sjg_2930a730-d5e2-49e1-a618-7428b999a73d/ssh-known-hosts-edpm-deployment/0.log" Feb 26 12:11:43 crc 
kubenswrapper[4699]: I0226 12:11:43.837564 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78cbc76b59-m6shv_5a4ece68-df2a-480c-9531-1d133d7f4bd0/proxy-server/0.log" Feb 26 12:11:43 crc kubenswrapper[4699]: I0226 12:11:43.982414 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78cbc76b59-m6shv_5a4ece68-df2a-480c-9531-1d133d7f4bd0/proxy-httpd/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.267675 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lqqdx_9125ee3a-a0b6-469b-b79d-3a376f2d5d91/swift-ring-rebalance/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.316297 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-reaper/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.384631 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-auditor/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.488004 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-replicator/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.545986 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-server/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.554933 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-auditor/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.614180 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-replicator/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 
12:11:44.737765 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-server/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.744929 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-auditor/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.755963 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-updater/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.845902 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-expirer/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.951191 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-server/0.log" Feb 26 12:11:44 crc kubenswrapper[4699]: I0226 12:11:44.956139 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-replicator/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.009914 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-updater/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.073016 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/rsync/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.184206 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/swift-recon-cron/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.305632 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9_08bdd16a-fc18-4262-9175-a05b613a76c9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.524998 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_19e02200-91be-49f8-8174-4a0bf6cda9dd/tempest-tests-tempest-tests-runner/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.547437 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_66beadbe-fd5d-48af-8a33-8a652c8d1c71/test-operator-logs-container/0.log" Feb 26 12:11:45 crc kubenswrapper[4699]: I0226 12:11:45.686077 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9npsm_974c869a-b430-4a83-81d0-ece37d67c0b0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:11:52 crc kubenswrapper[4699]: I0226 12:11:52.915935 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2/memcached/0.log" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.143305 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535132-hr4rf"] Feb 26 12:12:00 crc kubenswrapper[4699]: E0226 12:12:00.146176 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9944e38-a08c-4638-b132-1841c82d51c2" containerName="container-00" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.146298 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9944e38-a08c-4638-b132-1841c82d51c2" containerName="container-00" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.146700 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9944e38-a08c-4638-b132-1841c82d51c2" containerName="container-00" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.147555 4699 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.152044 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.152284 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.152399 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.156790 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535132-hr4rf"] Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.350889 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kshk8\" (UniqueName: \"kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8\") pod \"auto-csr-approver-29535132-hr4rf\" (UID: \"5ff3c3e0-4f58-401c-9f5f-b733727f73ff\") " pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.453350 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kshk8\" (UniqueName: \"kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8\") pod \"auto-csr-approver-29535132-hr4rf\" (UID: \"5ff3c3e0-4f58-401c-9f5f-b733727f73ff\") " pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.475751 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kshk8\" (UniqueName: \"kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8\") pod \"auto-csr-approver-29535132-hr4rf\" (UID: 
\"5ff3c3e0-4f58-401c-9f5f-b733727f73ff\") " pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:00 crc kubenswrapper[4699]: I0226 12:12:00.768257 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:01 crc kubenswrapper[4699]: I0226 12:12:01.200060 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535132-hr4rf"] Feb 26 12:12:01 crc kubenswrapper[4699]: I0226 12:12:01.929962 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" event={"ID":"5ff3c3e0-4f58-401c-9f5f-b733727f73ff","Type":"ContainerStarted","Data":"c74e194aee577dd844b3d47a42b28777ae42ce708f0b112dd1f7c9633e886051"} Feb 26 12:12:02 crc kubenswrapper[4699]: I0226 12:12:02.940622 4699 generic.go:334] "Generic (PLEG): container finished" podID="5ff3c3e0-4f58-401c-9f5f-b733727f73ff" containerID="747ddaa984d13eaf0f8ee9e7ae1b9299bffa91ea051e4eb23c1b1a2ab2aaf402" exitCode=0 Feb 26 12:12:02 crc kubenswrapper[4699]: I0226 12:12:02.940695 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" event={"ID":"5ff3c3e0-4f58-401c-9f5f-b733727f73ff","Type":"ContainerDied","Data":"747ddaa984d13eaf0f8ee9e7ae1b9299bffa91ea051e4eb23c1b1a2ab2aaf402"} Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.287192 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.431368 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kshk8\" (UniqueName: \"kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8\") pod \"5ff3c3e0-4f58-401c-9f5f-b733727f73ff\" (UID: \"5ff3c3e0-4f58-401c-9f5f-b733727f73ff\") " Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.440008 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8" (OuterVolumeSpecName: "kube-api-access-kshk8") pod "5ff3c3e0-4f58-401c-9f5f-b733727f73ff" (UID: "5ff3c3e0-4f58-401c-9f5f-b733727f73ff"). InnerVolumeSpecName "kube-api-access-kshk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.534343 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kshk8\" (UniqueName: \"kubernetes.io/projected/5ff3c3e0-4f58-401c-9f5f-b733727f73ff-kube-api-access-kshk8\") on node \"crc\" DevicePath \"\"" Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.962011 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" event={"ID":"5ff3c3e0-4f58-401c-9f5f-b733727f73ff","Type":"ContainerDied","Data":"c74e194aee577dd844b3d47a42b28777ae42ce708f0b112dd1f7c9633e886051"} Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.962053 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74e194aee577dd844b3d47a42b28777ae42ce708f0b112dd1f7c9633e886051" Feb 26 12:12:04 crc kubenswrapper[4699]: I0226 12:12:04.962105 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535132-hr4rf" Feb 26 12:12:05 crc kubenswrapper[4699]: I0226 12:12:05.358785 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535126-n7gpm"] Feb 26 12:12:05 crc kubenswrapper[4699]: I0226 12:12:05.367860 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535126-n7gpm"] Feb 26 12:12:06 crc kubenswrapper[4699]: I0226 12:12:06.274751 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62b893f-dc84-4f3a-9c62-5c49c65be99f" path="/var/lib/kubelet/pods/d62b893f-dc84-4f3a-9c62-5c49c65be99f/volumes" Feb 26 12:12:10 crc kubenswrapper[4699]: I0226 12:12:10.636404 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-4k4sm_07c2552c-8182-4cfe-a397-39ad287029e5/manager/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.070998 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.279820 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.306520 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.478494 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log" Feb 26 12:12:11 crc 
kubenswrapper[4699]: I0226 12:12:11.584625 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.584682 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.584731 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.585495 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.585543 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" gracePeriod=600 Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.653510 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.691770 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: E0226 12:12:11.707054 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.817207 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-xw85z_35555f68-d5c4-44b2-9dfa-af5f91f57c7c/manager/0.log" Feb 26 12:12:11 crc kubenswrapper[4699]: I0226 12:12:11.845019 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/extract/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.110440 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-t8c9f_7b204025-d5ff-4c74-96b9-6774b62e0cc4/manager/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.179938 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-jh7vz_27e251bb-8f9b-48d4-9ea3-81d03fd85244/manager/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.325690 4699 
generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" exitCode=0 Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.325737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"} Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.325775 4699 scope.go:117] "RemoveContainer" containerID="243333360a6594f8acbe71e1e9197448e74ac1a0258671779fb6af974ca032dd" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.326604 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:12:12 crc kubenswrapper[4699]: E0226 12:12:12.326926 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.385179 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qf9vd_619dff06-7255-4aab-9ffe-9f2561bcc904/manager/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.658499 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-5k85p_d56efcbf-3414-4bd1-9cbf-d56c434ac529/manager/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.951921 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-d2pxc_a2c419ab-2a99-4d37-b46c-b84024f24b2e/manager/0.log" Feb 26 12:12:12 crc kubenswrapper[4699]: I0226 12:12:12.979924 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-mtrs6_afbeb2d8-c332-447b-a931-9fe7b246914d/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.146035 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-9gwwj_caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.323517 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-95whc_38eef260-c32f-4568-9936-6197ba984f05/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.496952 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-6gblm_54959b79-361c-415a-986d-1af6d8eb6701/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.755807 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-2wj2n_a6e7ca85-e18b-4605-9180-316f65b82006/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.763074 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4mghs_0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee/manager/0.log" Feb 26 12:12:13 crc kubenswrapper[4699]: I0226 12:12:13.962696 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb_ce7c40ca-05ad-49ca-a091-02ac588c3eb7/manager/0.log" Feb 26 12:12:14 crc kubenswrapper[4699]: I0226 12:12:14.437321 4699 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7c5cc54f9c-wjrrd_3a6d1210-ece5-4666-80bf-c7c7821e441c/operator/0.log" Feb 26 12:12:14 crc kubenswrapper[4699]: I0226 12:12:14.574347 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gmh8j_22cfe789-87ae-4b23-91c2-cbb5112e4285/registry-server/0.log" Feb 26 12:12:14 crc kubenswrapper[4699]: I0226 12:12:14.895775 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-96png_a90c4025-7bd1-401b-8f92-5f15a58fb3d6/manager/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.050357 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-jxr77_7545763d-d2d2-4b6e-980d-737062f0a894/manager/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.176292 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ghqf4_8d440653-f1c3-483c-a37d-463dcfc15224/operator/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.305371 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-bqvxr_33fc0a61-18c9-4e80-b898-92a5b1b71dac/manager/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.542309 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-f9kz5_15255a9b-0767-4518-8e81-ca9044f9190a/manager/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.609458 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-mwvnr_5be0c14a-e51f-4b69-ab58-c0cac66910e2/manager/0.log" Feb 26 12:12:15 crc kubenswrapper[4699]: I0226 12:12:15.788972 4699 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-fnnc7_a2b3bf3b-a815-4033-983b-eedc16b8609f/manager/0.log" Feb 26 12:12:16 crc kubenswrapper[4699]: I0226 12:12:16.003906 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-947f4f86b-m69sv_ebf1a568-be30-4ceb-bc67-e3158a0280b9/manager/0.log" Feb 26 12:12:16 crc kubenswrapper[4699]: I0226 12:12:16.885782 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-sndb9_1814471e-5f82-4464-9528-75da66d7235b/manager/0.log" Feb 26 12:12:25 crc kubenswrapper[4699]: I0226 12:12:25.260700 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:12:25 crc kubenswrapper[4699]: E0226 12:12:25.262529 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:12:35 crc kubenswrapper[4699]: I0226 12:12:35.098168 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-p9wj4_bad776f4-e24b-41f1-88d8-2b1fe6258783/control-plane-machine-set-operator/0.log" Feb 26 12:12:35 crc kubenswrapper[4699]: I0226 12:12:35.305310 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw64v_5d015dd8-56c9-4f61-b133-4951cda91ca5/machine-api-operator/0.log" Feb 26 12:12:35 crc kubenswrapper[4699]: I0226 12:12:35.345901 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw64v_5d015dd8-56c9-4f61-b133-4951cda91ca5/kube-rbac-proxy/0.log"
Feb 26 12:12:37 crc kubenswrapper[4699]: I0226 12:12:37.261389 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:12:37 crc kubenswrapper[4699]: E0226 12:12:37.262053 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:12:39 crc kubenswrapper[4699]: I0226 12:12:39.818522 4699 scope.go:117] "RemoveContainer" containerID="0a9b5f9a5f2d730b937d8d7362f22b7e6fe3edad8ecb5523a71d611f339c4a8e"
Feb 26 12:12:47 crc kubenswrapper[4699]: I0226 12:12:47.477706 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fhn2n_fc42522b-c5f4-4df2-8435-3e3985dd960c/cert-manager-controller/0.log"
Feb 26 12:12:47 crc kubenswrapper[4699]: I0226 12:12:47.650935 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dswxp_f026799a-39c7-443e-9801-f046ba8ae94b/cert-manager-cainjector/0.log"
Feb 26 12:12:47 crc kubenswrapper[4699]: I0226 12:12:47.717272 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-l2fdt_fad1f923-b22c-4c0d-9eb9-684636bc76c0/cert-manager-webhook/0.log"
Feb 26 12:12:52 crc kubenswrapper[4699]: I0226 12:12:52.262581 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:12:52 crc kubenswrapper[4699]: E0226 12:12:52.263707 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.392841 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-7f4bx_13fc1aa0-a043-4b42-952b-7f718ff577d2/nmstate-console-plugin/0.log"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.579535 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5jrwg_80de38f0-8620-4e27-988e-6d85d7c8bc24/nmstate-handler/0.log"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.625265 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jnrsc_c4897df9-3a79-41bf-a7ba-7a72d888f8e1/kube-rbac-proxy/0.log"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.678641 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jnrsc_c4897df9-3a79-41bf-a7ba-7a72d888f8e1/nmstate-metrics/0.log"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.776250 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-8l8n8_15312afe-49aa-4681-8513-6ed9c774d222/nmstate-operator/0.log"
Feb 26 12:12:59 crc kubenswrapper[4699]: I0226 12:12:59.881418 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-qmw66_d674e733-7357-43e5-be9c-4d4e9bad252c/nmstate-webhook/0.log"
Feb 26 12:13:06 crc kubenswrapper[4699]: I0226 12:13:06.268130 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:13:06 crc kubenswrapper[4699]: E0226 12:13:06.269002 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:13:17 crc kubenswrapper[4699]: I0226 12:13:17.260389 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:13:17 crc kubenswrapper[4699]: E0226 12:13:17.261416 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:13:24 crc kubenswrapper[4699]: I0226 12:13:24.707935 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-bs5nk_6ef6a9d7-6997-485a-a812-ded9d3a2df85/kube-rbac-proxy/0.log"
Feb 26 12:13:24 crc kubenswrapper[4699]: I0226 12:13:24.761653 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-bs5nk_6ef6a9d7-6997-485a-a812-ded9d3a2df85/controller/0.log"
Feb 26 12:13:24 crc kubenswrapper[4699]: I0226 12:13:24.915683 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-svsrb_35357e2c-2a03-46f8-bc28-f7daad3b679d/frr-k8s-webhook-server/0.log"
Feb 26 12:13:24 crc kubenswrapper[4699]: I0226 12:13:24.997239 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.117277 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.138151 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.143939 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.186943 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.376966 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.394648 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.395812 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.411040 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.572476 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.575407 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.586097 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/controller/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.599274 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.750497 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/frr-metrics/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.785489 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/kube-rbac-proxy-frr/0.log"
Feb 26 12:13:25 crc kubenswrapper[4699]: I0226 12:13:25.785491 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/kube-rbac-proxy/0.log"
Feb 26 12:13:26 crc kubenswrapper[4699]: I0226 12:13:26.007157 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/reloader/0.log"
Feb 26 12:13:26 crc kubenswrapper[4699]: I0226 12:13:26.019869 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d58b8658b-qjr5b_cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8/manager/0.log"
Feb 26 12:13:26 crc kubenswrapper[4699]: I0226 12:13:26.210183 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6d98597f89-glkjh_af2438c1-8812-4bb1-8999-66cb8d804c05/webhook-server/0.log"
Feb 26 12:13:26 crc kubenswrapper[4699]: I0226 12:13:26.453958 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l8phj_d656ca89-f955-44bb-9944-f75bf485a254/kube-rbac-proxy/0.log"
Feb 26 12:13:27 crc kubenswrapper[4699]: I0226 12:13:27.001676 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l8phj_d656ca89-f955-44bb-9944-f75bf485a254/speaker/0.log"
Feb 26 12:13:27 crc kubenswrapper[4699]: I0226 12:13:27.490916 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/frr/0.log"
Feb 26 12:13:31 crc kubenswrapper[4699]: I0226 12:13:31.261554 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:13:31 crc kubenswrapper[4699]: E0226 12:13:31.262295 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:13:38 crc kubenswrapper[4699]: I0226 12:13:38.675225 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log"
Feb 26 12:13:38 crc kubenswrapper[4699]: I0226 12:13:38.802395 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log"
Feb 26 12:13:38 crc kubenswrapper[4699]: I0226 12:13:38.836154 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log"
Feb 26 12:13:38 crc kubenswrapper[4699]: I0226 12:13:38.871550 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.080224 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.081621 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.088387 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/extract/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.225679 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.379486 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.402313 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.413159 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.577756 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.585499 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log"
Feb 26 12:13:39 crc kubenswrapper[4699]: I0226 12:13:39.799131 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.036488 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/registry-server/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.037319 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.056433 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.071553 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.207662 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.241599 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.452774 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.637023 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.641026 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.662572 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/registry-server/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.672278 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.827510 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.840896 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log"
Feb 26 12:13:40 crc kubenswrapper[4699]: I0226 12:13:40.910602 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/extract/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.007890 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nwbkq_43a980f6-1eff-4610-aa3e-69729c3eb7c7/marketplace-operator/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.039522 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.248384 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.252831 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.271269 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.466141 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.485826 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.586130 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/registry-server/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.735420 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.830308 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.833786 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log"
Feb 26 12:13:41 crc kubenswrapper[4699]: I0226 12:13:41.851425 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log"
Feb 26 12:13:42 crc kubenswrapper[4699]: I0226 12:13:42.007348 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log"
Feb 26 12:13:42 crc kubenswrapper[4699]: I0226 12:13:42.014996 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log"
Feb 26 12:13:42 crc kubenswrapper[4699]: I0226 12:13:42.418939 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/registry-server/0.log"
Feb 26 12:13:46 crc kubenswrapper[4699]: I0226 12:13:46.267365 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:13:46 crc kubenswrapper[4699]: E0226 12:13:46.267633 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:13:57 crc kubenswrapper[4699]: I0226 12:13:57.261445 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:13:57 crc kubenswrapper[4699]: E0226 12:13:57.262282 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.143080 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535134-bh8fn"]
Feb 26 12:14:00 crc kubenswrapper[4699]: E0226 12:14:00.148770 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff3c3e0-4f58-401c-9f5f-b733727f73ff" containerName="oc"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.148794 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff3c3e0-4f58-401c-9f5f-b733727f73ff" containerName="oc"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.149053 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff3c3e0-4f58-401c-9f5f-b733727f73ff" containerName="oc"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.149848 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.154362 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.154563 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.154720 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.171432 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535134-bh8fn"]
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.331258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpvg5\" (UniqueName: \"kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5\") pod \"auto-csr-approver-29535134-bh8fn\" (UID: \"916cd984-33ed-4299-ade5-5064478d656f\") " pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.433144 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpvg5\" (UniqueName: \"kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5\") pod \"auto-csr-approver-29535134-bh8fn\" (UID: \"916cd984-33ed-4299-ade5-5064478d656f\") " pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.450521 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpvg5\" (UniqueName: \"kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5\") pod \"auto-csr-approver-29535134-bh8fn\" (UID: \"916cd984-33ed-4299-ade5-5064478d656f\") " pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.488714 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.982994 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535134-bh8fn"]
Feb 26 12:14:00 crc kubenswrapper[4699]: I0226 12:14:00.983498 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 12:14:01 crc kubenswrapper[4699]: I0226 12:14:01.374036 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535134-bh8fn" event={"ID":"916cd984-33ed-4299-ade5-5064478d656f","Type":"ContainerStarted","Data":"1c9da386567c1145e06dd7ab813fc07e39cefd0bde6d9e1a25317925d1acc515"}
Feb 26 12:14:03 crc kubenswrapper[4699]: I0226 12:14:03.395480 4699 generic.go:334] "Generic (PLEG): container finished" podID="916cd984-33ed-4299-ade5-5064478d656f" containerID="ae1928085c149280cf3addf69107c792048518ecf95f2de337f2886f53e0e594" exitCode=0
Feb 26 12:14:03 crc kubenswrapper[4699]: I0226 12:14:03.395689 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535134-bh8fn" event={"ID":"916cd984-33ed-4299-ade5-5064478d656f","Type":"ContainerDied","Data":"ae1928085c149280cf3addf69107c792048518ecf95f2de337f2886f53e0e594"}
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.042725 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.054374 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpvg5\" (UniqueName: \"kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5\") pod \"916cd984-33ed-4299-ade5-5064478d656f\" (UID: \"916cd984-33ed-4299-ade5-5064478d656f\") "
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.064608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5" (OuterVolumeSpecName: "kube-api-access-qpvg5") pod "916cd984-33ed-4299-ade5-5064478d656f" (UID: "916cd984-33ed-4299-ade5-5064478d656f"). InnerVolumeSpecName "kube-api-access-qpvg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.157283 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpvg5\" (UniqueName: \"kubernetes.io/projected/916cd984-33ed-4299-ade5-5064478d656f-kube-api-access-qpvg5\") on node \"crc\" DevicePath \"\""
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.415374 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535134-bh8fn" event={"ID":"916cd984-33ed-4299-ade5-5064478d656f","Type":"ContainerDied","Data":"1c9da386567c1145e06dd7ab813fc07e39cefd0bde6d9e1a25317925d1acc515"}
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.415415 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c9da386567c1145e06dd7ab813fc07e39cefd0bde6d9e1a25317925d1acc515"
Feb 26 12:14:05 crc kubenswrapper[4699]: I0226 12:14:05.415469 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535134-bh8fn"
Feb 26 12:14:06 crc kubenswrapper[4699]: I0226 12:14:06.114239 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535128-4vkb4"]
Feb 26 12:14:06 crc kubenswrapper[4699]: I0226 12:14:06.132253 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535128-4vkb4"]
Feb 26 12:14:06 crc kubenswrapper[4699]: I0226 12:14:06.288250 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e519986-41ca-4360-b9bd-14a485e9a635" path="/var/lib/kubelet/pods/0e519986-41ca-4360-b9bd-14a485e9a635/volumes"
Feb 26 12:14:10 crc kubenswrapper[4699]: I0226 12:14:10.265035 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:14:10 crc kubenswrapper[4699]: E0226 12:14:10.265914 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:14:22 crc kubenswrapper[4699]: I0226 12:14:22.261488 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:14:22 crc kubenswrapper[4699]: E0226 12:14:22.262154 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.791778 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qwws6"]
Feb 26 12:14:33 crc kubenswrapper[4699]: E0226 12:14:33.793063 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916cd984-33ed-4299-ade5-5064478d656f" containerName="oc"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.793085 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="916cd984-33ed-4299-ade5-5064478d656f" containerName="oc"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.793369 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="916cd984-33ed-4299-ade5-5064478d656f" containerName="oc"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.794697 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.806865 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qwws6"]
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.949434 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6wmj\" (UniqueName: \"kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.949850 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:33 crc kubenswrapper[4699]: I0226 12:14:33.949978 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.052010 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.052073 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.052133 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6wmj\" (UniqueName: \"kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.052990 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.053074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.082508 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6wmj\" (UniqueName: \"kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj\") pod \"community-operators-qwws6\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.188945 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:34 crc kubenswrapper[4699]: I0226 12:14:34.679734 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qwws6"]
Feb 26 12:14:35 crc kubenswrapper[4699]: I0226 12:14:35.260777 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:14:35 crc kubenswrapper[4699]: E0226 12:14:35.261273 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:14:35 crc kubenswrapper[4699]: I0226 12:14:35.697382 4699 generic.go:334] "Generic (PLEG): container finished" podID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerID="f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3" exitCode=0
Feb 26 12:14:35 crc kubenswrapper[4699]: I0226 12:14:35.698029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerDied","Data":"f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3"}
Feb 26 12:14:35 crc kubenswrapper[4699]: I0226 12:14:35.698089 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerStarted","Data":"89eafaeedd18474def84950473646ec600025a974d15625aadc38ccb3c651b4c"}
Feb 26 12:14:36 crc kubenswrapper[4699]: I0226 12:14:36.708816 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerStarted","Data":"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1"}
Feb 26 12:14:37 crc kubenswrapper[4699]: I0226 12:14:37.722897 4699 generic.go:334] "Generic (PLEG): container finished" podID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerID="03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1" exitCode=0
Feb 26 12:14:37 crc kubenswrapper[4699]: I0226 12:14:37.723278 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerDied","Data":"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1"}
Feb 26 12:14:38 crc kubenswrapper[4699]: I0226 12:14:38.734454 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerStarted","Data":"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c"}
Feb 26 12:14:38 crc kubenswrapper[4699]: I0226 12:14:38.765133 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qwws6" podStartSLOduration=3.333039178 podStartE2EDuration="5.765094784s" podCreationTimestamp="2026-02-26 12:14:33 +0000 UTC" firstStartedPulling="2026-02-26 12:14:35.701312013 +0000 UTC m=+3821.512138447" lastFinishedPulling="2026-02-26 12:14:38.133367599 +0000 UTC m=+3823.944194053" observedRunningTime="2026-02-26 12:14:38.757065047 +0000 UTC m=+3824.567891491" watchObservedRunningTime="2026-02-26 12:14:38.765094784 +0000 UTC m=+3824.575921218"
Feb 26 12:14:39 crc kubenswrapper[4699]: I0226 12:14:39.915425 4699 scope.go:117] "RemoveContainer" containerID="edcce5d1b2431ea73d4d1a16900e65c51edf48fce3e10f865133733ba98e31ff"
Feb 26 12:14:44 crc kubenswrapper[4699]: I0226 12:14:44.189686 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:44 crc kubenswrapper[4699]: I0226 12:14:44.190769 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:44 crc kubenswrapper[4699]: I0226 12:14:44.233110 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:44 crc kubenswrapper[4699]: I0226 12:14:44.858744 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qwws6"
Feb 26 12:14:44 crc kubenswrapper[4699]: I0226 12:14:44.909503 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qwws6"]
Feb 26 12:14:46 crc kubenswrapper[4699]: I0226 12:14:46.819597 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qwws6" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="registry-server"
containerID="cri-o://bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c" gracePeriod=2 Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.356706 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qwws6" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.533059 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content\") pod \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.533502 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities\") pod \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.533683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6wmj\" (UniqueName: \"kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj\") pod \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\" (UID: \"c99a1b65-dd7a-4d1c-a767-43eb7192dea7\") " Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.534300 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities" (OuterVolumeSpecName: "utilities") pod "c99a1b65-dd7a-4d1c-a767-43eb7192dea7" (UID: "c99a1b65-dd7a-4d1c-a767-43eb7192dea7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.534626 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.547874 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj" (OuterVolumeSpecName: "kube-api-access-v6wmj") pod "c99a1b65-dd7a-4d1c-a767-43eb7192dea7" (UID: "c99a1b65-dd7a-4d1c-a767-43eb7192dea7"). InnerVolumeSpecName "kube-api-access-v6wmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.619437 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c99a1b65-dd7a-4d1c-a767-43eb7192dea7" (UID: "c99a1b65-dd7a-4d1c-a767-43eb7192dea7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.636569 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.636619 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6wmj\" (UniqueName: \"kubernetes.io/projected/c99a1b65-dd7a-4d1c-a767-43eb7192dea7-kube-api-access-v6wmj\") on node \"crc\" DevicePath \"\"" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.827957 4699 generic.go:334] "Generic (PLEG): container finished" podID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerID="bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c" exitCode=0 Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.828007 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerDied","Data":"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c"} Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.828042 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwws6" event={"ID":"c99a1b65-dd7a-4d1c-a767-43eb7192dea7","Type":"ContainerDied","Data":"89eafaeedd18474def84950473646ec600025a974d15625aadc38ccb3c651b4c"} Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.828047 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qwws6" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.828062 4699 scope.go:117] "RemoveContainer" containerID="bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.864443 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qwws6"] Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.869874 4699 scope.go:117] "RemoveContainer" containerID="03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.876803 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qwws6"] Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.894880 4699 scope.go:117] "RemoveContainer" containerID="f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.930270 4699 scope.go:117] "RemoveContainer" containerID="bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c" Feb 26 12:14:47 crc kubenswrapper[4699]: E0226 12:14:47.930704 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c\": container with ID starting with bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c not found: ID does not exist" containerID="bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.930770 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c"} err="failed to get container status \"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c\": rpc error: code = NotFound desc = could not find 
container \"bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c\": container with ID starting with bb353a6cefd1490bc9489650a34b8b94c97dc0a81961d477aa68b76700f3d58c not found: ID does not exist" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.930799 4699 scope.go:117] "RemoveContainer" containerID="03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1" Feb 26 12:14:47 crc kubenswrapper[4699]: E0226 12:14:47.931088 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1\": container with ID starting with 03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1 not found: ID does not exist" containerID="03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.931141 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1"} err="failed to get container status \"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1\": rpc error: code = NotFound desc = could not find container \"03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1\": container with ID starting with 03997dd7e5a08e62d93a69b33ec106072b174646b82c2a64e9e509952ea5dbd1 not found: ID does not exist" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.931169 4699 scope.go:117] "RemoveContainer" containerID="f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3" Feb 26 12:14:47 crc kubenswrapper[4699]: E0226 12:14:47.931580 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3\": container with ID starting with f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3 not found: ID does 
not exist" containerID="f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3" Feb 26 12:14:47 crc kubenswrapper[4699]: I0226 12:14:47.931611 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3"} err="failed to get container status \"f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3\": rpc error: code = NotFound desc = could not find container \"f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3\": container with ID starting with f22c7b5f9734f5bd9f782943baa7fd673b2c739b83f0f0acf9560a4f0b14c5c3 not found: ID does not exist" Feb 26 12:14:48 crc kubenswrapper[4699]: I0226 12:14:48.262197 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:14:48 crc kubenswrapper[4699]: E0226 12:14:48.262414 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:14:48 crc kubenswrapper[4699]: I0226 12:14:48.272028 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" path="/var/lib/kubelet/pods/c99a1b65-dd7a-4d1c-a767-43eb7192dea7/volumes" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.150437 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb"] Feb 26 12:15:00 crc kubenswrapper[4699]: E0226 12:15:00.151553 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" 
containerName="extract-utilities" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.151570 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="extract-utilities" Feb 26 12:15:00 crc kubenswrapper[4699]: E0226 12:15:00.151586 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="registry-server" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.151592 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="registry-server" Feb 26 12:15:00 crc kubenswrapper[4699]: E0226 12:15:00.151604 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="extract-content" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.151609 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="extract-content" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.151794 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99a1b65-dd7a-4d1c-a767-43eb7192dea7" containerName="registry-server" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.152624 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.155848 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.156181 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.160648 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb"] Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.321424 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.321498 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2q9m\" (UniqueName: \"kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.321563 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.422727 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2q9m\" (UniqueName: \"kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.422816 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.422944 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.424223 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.429250 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.440008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2q9m\" (UniqueName: \"kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m\") pod \"collect-profiles-29535135-wjstb\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:00 crc kubenswrapper[4699]: I0226 12:15:00.510085 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:01 crc kubenswrapper[4699]: I0226 12:15:01.006522 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb"] Feb 26 12:15:01 crc kubenswrapper[4699]: I0226 12:15:01.945329 4699 generic.go:334] "Generic (PLEG): container finished" podID="a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" containerID="dcd2b4d3e7e82abee5eb779094bdc847007f7d64e831426aba90d3a867cacda2" exitCode=0 Feb 26 12:15:01 crc kubenswrapper[4699]: I0226 12:15:01.945519 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" event={"ID":"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7","Type":"ContainerDied","Data":"dcd2b4d3e7e82abee5eb779094bdc847007f7d64e831426aba90d3a867cacda2"} Feb 26 12:15:01 crc kubenswrapper[4699]: I0226 12:15:01.946659 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" 
event={"ID":"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7","Type":"ContainerStarted","Data":"baa44810e6541d95a944872949381ca67f1a661fa4ff50b10dcb9803f9c72471"} Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.262455 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:15:03 crc kubenswrapper[4699]: E0226 12:15:03.263254 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.274516 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.398473 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2q9m\" (UniqueName: \"kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m\") pod \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.398908 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume\") pod \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.399083 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume\") pod \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\" (UID: \"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7\") " Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.399948 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" (UID: "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.402255 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.407256 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" (UID: "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.407462 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m" (OuterVolumeSpecName: "kube-api-access-z2q9m") pod "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" (UID: "a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7"). InnerVolumeSpecName "kube-api-access-z2q9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.504523 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2q9m\" (UniqueName: \"kubernetes.io/projected/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-kube-api-access-z2q9m\") on node \"crc\" DevicePath \"\"" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.504583 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.965040 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" event={"ID":"a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7","Type":"ContainerDied","Data":"baa44810e6541d95a944872949381ca67f1a661fa4ff50b10dcb9803f9c72471"} Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.965537 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa44810e6541d95a944872949381ca67f1a661fa4ff50b10dcb9803f9c72471" Feb 26 12:15:03 crc kubenswrapper[4699]: I0226 12:15:03.965251 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535135-wjstb" Feb 26 12:15:04 crc kubenswrapper[4699]: I0226 12:15:04.383574 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"] Feb 26 12:15:04 crc kubenswrapper[4699]: I0226 12:15:04.391404 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535090-n42nj"] Feb 26 12:15:06 crc kubenswrapper[4699]: I0226 12:15:06.279396 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b298a96-eca9-49eb-a547-f88e986f326e" path="/var/lib/kubelet/pods/9b298a96-eca9-49eb-a547-f88e986f326e/volumes" Feb 26 12:15:14 crc kubenswrapper[4699]: I0226 12:15:14.260944 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:15:14 crc kubenswrapper[4699]: E0226 12:15:14.261738 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:15:28 crc kubenswrapper[4699]: I0226 12:15:28.261967 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:15:28 crc kubenswrapper[4699]: E0226 12:15:28.262853 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:15:31 crc kubenswrapper[4699]: I0226 12:15:31.252412 4699 generic.go:334] "Generic (PLEG): container finished" podID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerID="69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb" exitCode=0 Feb 26 12:15:31 crc kubenswrapper[4699]: I0226 12:15:31.252512 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4plh5/must-gather-cwqbr" event={"ID":"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9","Type":"ContainerDied","Data":"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb"} Feb 26 12:15:31 crc kubenswrapper[4699]: I0226 12:15:31.253566 4699 scope.go:117] "RemoveContainer" containerID="69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb" Feb 26 12:15:32 crc kubenswrapper[4699]: I0226 12:15:32.053521 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4plh5_must-gather-cwqbr_2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9/gather/0.log" Feb 26 12:15:39 crc kubenswrapper[4699]: I0226 12:15:39.558598 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4plh5/must-gather-cwqbr"] Feb 26 12:15:39 crc kubenswrapper[4699]: I0226 12:15:39.559287 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4plh5/must-gather-cwqbr" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="copy" containerID="cri-o://40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd" gracePeriod=2 Feb 26 12:15:39 crc kubenswrapper[4699]: I0226 12:15:39.567196 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4plh5/must-gather-cwqbr"] Feb 26 12:15:39 crc kubenswrapper[4699]: I0226 12:15:39.998518 4699 scope.go:117] "RemoveContainer" containerID="81dc18175a458a0d1e57583f805b2614af5b4f06183622336860874df0cedc4e" Feb 26 12:15:40 crc 
kubenswrapper[4699]: I0226 12:15:40.092770 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4plh5_must-gather-cwqbr_2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9/copy/0.log" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.093612 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.260692 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:15:40 crc kubenswrapper[4699]: E0226 12:15:40.261299 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.271881 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p96w\" (UniqueName: \"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w\") pod \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.272270 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output\") pod \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\" (UID: \"2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9\") " Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.277236 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w" (OuterVolumeSpecName: "kube-api-access-8p96w") pod "2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" (UID: "2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9"). InnerVolumeSpecName "kube-api-access-8p96w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.374988 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p96w\" (UniqueName: \"kubernetes.io/projected/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-kube-api-access-8p96w\") on node \"crc\" DevicePath \"\"" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.441566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" (UID: "2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.450669 4699 generic.go:334] "Generic (PLEG): container finished" podID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerID="40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd" exitCode=143 Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.450781 4699 scope.go:117] "RemoveContainer" containerID="40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.450832 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4plh5/must-gather-cwqbr" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.475811 4699 scope.go:117] "RemoveContainer" containerID="69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.477050 4699 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.809713 4699 scope.go:117] "RemoveContainer" containerID="40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd" Feb 26 12:15:40 crc kubenswrapper[4699]: E0226 12:15:40.821166 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd\": container with ID starting with 40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd not found: ID does not exist" containerID="40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.821216 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd"} err="failed to get container status \"40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd\": rpc error: code = NotFound desc = could not find container \"40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd\": container with ID starting with 40c5e22b0608cb2076c35bbaefac8f43550651c9e232dc9f005649022a951ddd not found: ID does not exist" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.821246 4699 scope.go:117] "RemoveContainer" containerID="69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb" Feb 26 12:15:40 crc kubenswrapper[4699]: E0226 
12:15:40.824541 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb\": container with ID starting with 69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb not found: ID does not exist" containerID="69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb" Feb 26 12:15:40 crc kubenswrapper[4699]: I0226 12:15:40.824592 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb"} err="failed to get container status \"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb\": rpc error: code = NotFound desc = could not find container \"69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb\": container with ID starting with 69186eaeffdff5cb7d043fe084a54702933857d82206e64b726b17a9aa98bafb not found: ID does not exist" Feb 26 12:15:42 crc kubenswrapper[4699]: I0226 12:15:42.272019 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" path="/var/lib/kubelet/pods/2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9/volumes" Feb 26 12:15:51 crc kubenswrapper[4699]: I0226 12:15:51.260577 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:15:51 crc kubenswrapper[4699]: E0226 12:15:51.261230 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.148156 
4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535136-zr5lr"] Feb 26 12:16:00 crc kubenswrapper[4699]: E0226 12:16:00.149724 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="gather" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.149747 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="gather" Feb 26 12:16:00 crc kubenswrapper[4699]: E0226 12:16:00.149783 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="copy" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.149791 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="copy" Feb 26 12:16:00 crc kubenswrapper[4699]: E0226 12:16:00.149814 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" containerName="collect-profiles" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.149823 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" containerName="collect-profiles" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.150095 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="gather" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.150159 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6e42cb-6891-4f97-9ba8-b4c6ad63a7a9" containerName="copy" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.150177 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30d5384-a2b7-4e4b-8bf5-1bb62cdb82f7" containerName="collect-profiles" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.151028 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.153985 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.156094 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.164525 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.177531 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535136-zr5lr"] Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.267739 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb5d\" (UniqueName: \"kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d\") pod \"auto-csr-approver-29535136-zr5lr\" (UID: \"502aea63-b1be-4c9e-850b-bc5a2503b628\") " pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.369909 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb5d\" (UniqueName: \"kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d\") pod \"auto-csr-approver-29535136-zr5lr\" (UID: \"502aea63-b1be-4c9e-850b-bc5a2503b628\") " pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.387904 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb5d\" (UniqueName: \"kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d\") pod \"auto-csr-approver-29535136-zr5lr\" (UID: \"502aea63-b1be-4c9e-850b-bc5a2503b628\") " 
pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.517842 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:00 crc kubenswrapper[4699]: W0226 12:16:00.940808 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod502aea63_b1be_4c9e_850b_bc5a2503b628.slice/crio-a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae WatchSource:0}: Error finding container a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae: Status 404 returned error can't find the container with id a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae Feb 26 12:16:00 crc kubenswrapper[4699]: I0226 12:16:00.944306 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535136-zr5lr"] Feb 26 12:16:01 crc kubenswrapper[4699]: I0226 12:16:01.652379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" event={"ID":"502aea63-b1be-4c9e-850b-bc5a2503b628","Type":"ContainerStarted","Data":"a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae"} Feb 26 12:16:02 crc kubenswrapper[4699]: I0226 12:16:02.667266 4699 generic.go:334] "Generic (PLEG): container finished" podID="502aea63-b1be-4c9e-850b-bc5a2503b628" containerID="6e828c6eb232b14fedfc4161c27c5a5dd3b91bd1fe215ef080f8deb69fce1e31" exitCode=0 Feb 26 12:16:02 crc kubenswrapper[4699]: I0226 12:16:02.667414 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" event={"ID":"502aea63-b1be-4c9e-850b-bc5a2503b628","Type":"ContainerDied","Data":"6e828c6eb232b14fedfc4161c27c5a5dd3b91bd1fe215ef080f8deb69fce1e31"} Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.072667 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.246681 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgb5d\" (UniqueName: \"kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d\") pod \"502aea63-b1be-4c9e-850b-bc5a2503b628\" (UID: \"502aea63-b1be-4c9e-850b-bc5a2503b628\") " Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.254821 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d" (OuterVolumeSpecName: "kube-api-access-fgb5d") pod "502aea63-b1be-4c9e-850b-bc5a2503b628" (UID: "502aea63-b1be-4c9e-850b-bc5a2503b628"). InnerVolumeSpecName "kube-api-access-fgb5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.261692 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:16:04 crc kubenswrapper[4699]: E0226 12:16:04.262056 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.349084 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgb5d\" (UniqueName: \"kubernetes.io/projected/502aea63-b1be-4c9e-850b-bc5a2503b628-kube-api-access-fgb5d\") on node \"crc\" DevicePath \"\"" Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.684407 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535136-zr5lr" event={"ID":"502aea63-b1be-4c9e-850b-bc5a2503b628","Type":"ContainerDied","Data":"a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae"} Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.684453 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3680e8d9d354f766a8d1a4052690172c397bfc0e5fac85d7ed29e97ed123cae" Feb 26 12:16:04 crc kubenswrapper[4699]: I0226 12:16:04.684456 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535136-zr5lr" Feb 26 12:16:05 crc kubenswrapper[4699]: I0226 12:16:05.141776 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535130-v52gt"] Feb 26 12:16:05 crc kubenswrapper[4699]: I0226 12:16:05.159227 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535130-v52gt"] Feb 26 12:16:06 crc kubenswrapper[4699]: I0226 12:16:06.270091 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524c38c5-5560-45a6-aa15-3010000b2165" path="/var/lib/kubelet/pods/524c38c5-5560-45a6-aa15-3010000b2165/volumes" Feb 26 12:16:19 crc kubenswrapper[4699]: I0226 12:16:19.261766 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:16:19 crc kubenswrapper[4699]: E0226 12:16:19.262602 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:31 crc kubenswrapper[4699]: I0226 12:16:31.261202 4699 scope.go:117] "RemoveContainer" 
containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:16:31 crc kubenswrapper[4699]: E0226 12:16:31.261963 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:40 crc kubenswrapper[4699]: I0226 12:16:40.150084 4699 scope.go:117] "RemoveContainer" containerID="548a0e8c1b14580465351f41c66bafc1b217669a68f00a69bd71038d87540f9f" Feb 26 12:16:40 crc kubenswrapper[4699]: I0226 12:16:40.176757 4699 scope.go:117] "RemoveContainer" containerID="b1e1f8248ccd17084f1b3aa21ad1265018f7368ffdb4ddbf286721c65474aad5" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.060527 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:16:46 crc kubenswrapper[4699]: E0226 12:16:46.061797 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502aea63-b1be-4c9e-850b-bc5a2503b628" containerName="oc" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.061827 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="502aea63-b1be-4c9e-850b-bc5a2503b628" containerName="oc" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.062056 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="502aea63-b1be-4c9e-850b-bc5a2503b628" containerName="oc" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.063399 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.080461 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.215164 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.215221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.215428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcvh\" (UniqueName: \"kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.267274 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:16:46 crc kubenswrapper[4699]: E0226 12:16:46.267644 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.317867 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.317948 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.318007 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcvh\" (UniqueName: \"kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.318510 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.318725 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.336309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcvh\" (UniqueName: \"kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh\") pod \"certified-operators-7xmfl\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.392152 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:46 crc kubenswrapper[4699]: I0226 12:16:46.913558 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:16:47 crc kubenswrapper[4699]: I0226 12:16:47.094328 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerStarted","Data":"a6d5b25b4d177d8555d869e48757c6481e857e538c0ed0ef4d0db5b527a06ba0"} Feb 26 12:16:48 crc kubenswrapper[4699]: I0226 12:16:48.105056 4699 generic.go:334] "Generic (PLEG): container finished" podID="50f381e3-7f33-484e-92d9-fae178f3c093" containerID="e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4" exitCode=0 Feb 26 12:16:48 crc kubenswrapper[4699]: I0226 12:16:48.105164 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerDied","Data":"e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4"} Feb 26 12:16:50 crc kubenswrapper[4699]: I0226 12:16:50.126090 4699 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerDied","Data":"b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552"} Feb 26 12:16:50 crc kubenswrapper[4699]: I0226 12:16:50.126839 4699 generic.go:334] "Generic (PLEG): container finished" podID="50f381e3-7f33-484e-92d9-fae178f3c093" containerID="b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552" exitCode=0 Feb 26 12:16:52 crc kubenswrapper[4699]: I0226 12:16:52.147190 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerStarted","Data":"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54"} Feb 26 12:16:52 crc kubenswrapper[4699]: I0226 12:16:52.172378 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7xmfl" podStartSLOduration=2.384850505 podStartE2EDuration="6.17235088s" podCreationTimestamp="2026-02-26 12:16:46 +0000 UTC" firstStartedPulling="2026-02-26 12:16:48.107497766 +0000 UTC m=+3953.918324210" lastFinishedPulling="2026-02-26 12:16:51.894998151 +0000 UTC m=+3957.705824585" observedRunningTime="2026-02-26 12:16:52.166490764 +0000 UTC m=+3957.977317198" watchObservedRunningTime="2026-02-26 12:16:52.17235088 +0000 UTC m=+3957.983177314" Feb 26 12:16:56 crc kubenswrapper[4699]: I0226 12:16:56.392692 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:56 crc kubenswrapper[4699]: I0226 12:16:56.393206 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:56 crc kubenswrapper[4699]: I0226 12:16:56.437459 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7xmfl" Feb 
26 12:16:57 crc kubenswrapper[4699]: I0226 12:16:57.248064 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:57 crc kubenswrapper[4699]: I0226 12:16:57.293436 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.224503 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7xmfl" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="registry-server" containerID="cri-o://17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54" gracePeriod=2 Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.261889 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:16:59 crc kubenswrapper[4699]: E0226 12:16:59.262277 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.741847 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.876742 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities\") pod \"50f381e3-7f33-484e-92d9-fae178f3c093\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.876973 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcvh\" (UniqueName: \"kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh\") pod \"50f381e3-7f33-484e-92d9-fae178f3c093\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.877006 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content\") pod \"50f381e3-7f33-484e-92d9-fae178f3c093\" (UID: \"50f381e3-7f33-484e-92d9-fae178f3c093\") " Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.877745 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities" (OuterVolumeSpecName: "utilities") pod "50f381e3-7f33-484e-92d9-fae178f3c093" (UID: "50f381e3-7f33-484e-92d9-fae178f3c093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.882368 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh" (OuterVolumeSpecName: "kube-api-access-xrcvh") pod "50f381e3-7f33-484e-92d9-fae178f3c093" (UID: "50f381e3-7f33-484e-92d9-fae178f3c093"). InnerVolumeSpecName "kube-api-access-xrcvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.944519 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50f381e3-7f33-484e-92d9-fae178f3c093" (UID: "50f381e3-7f33-484e-92d9-fae178f3c093"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.979874 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcvh\" (UniqueName: \"kubernetes.io/projected/50f381e3-7f33-484e-92d9-fae178f3c093-kube-api-access-xrcvh\") on node \"crc\" DevicePath \"\"" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.979915 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:16:59 crc kubenswrapper[4699]: I0226 12:16:59.979924 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f381e3-7f33-484e-92d9-fae178f3c093-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.235375 4699 generic.go:334] "Generic (PLEG): container finished" podID="50f381e3-7f33-484e-92d9-fae178f3c093" containerID="17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54" exitCode=0 Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.235419 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerDied","Data":"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54"} Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.235444 4699 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-7xmfl" event={"ID":"50f381e3-7f33-484e-92d9-fae178f3c093","Type":"ContainerDied","Data":"a6d5b25b4d177d8555d869e48757c6481e857e538c0ed0ef4d0db5b527a06ba0"} Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.235447 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7xmfl" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.235461 4699 scope.go:117] "RemoveContainer" containerID="17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.258509 4699 scope.go:117] "RemoveContainer" containerID="b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.285975 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.286149 4699 scope.go:117] "RemoveContainer" containerID="e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.292163 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7xmfl"] Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.330212 4699 scope.go:117] "RemoveContainer" containerID="17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54" Feb 26 12:17:00 crc kubenswrapper[4699]: E0226 12:17:00.330835 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54\": container with ID starting with 17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54 not found: ID does not exist" containerID="17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 
12:17:00.330951 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54"} err="failed to get container status \"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54\": rpc error: code = NotFound desc = could not find container \"17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54\": container with ID starting with 17dd68a15017639b9cdb1f6b050d9eec24aaeef2b101103c764ca0a655082d54 not found: ID does not exist" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.331046 4699 scope.go:117] "RemoveContainer" containerID="b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552" Feb 26 12:17:00 crc kubenswrapper[4699]: E0226 12:17:00.335283 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552\": container with ID starting with b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552 not found: ID does not exist" containerID="b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.335314 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552"} err="failed to get container status \"b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552\": rpc error: code = NotFound desc = could not find container \"b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552\": container with ID starting with b3f88c4dbc90d0e0841ba117e9a0bc5f2795f5cfd24580f6eef4ea3f84df4552 not found: ID does not exist" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.335331 4699 scope.go:117] "RemoveContainer" containerID="e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4" Feb 26 12:17:00 crc 
kubenswrapper[4699]: E0226 12:17:00.335587 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4\": container with ID starting with e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4 not found: ID does not exist" containerID="e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4" Feb 26 12:17:00 crc kubenswrapper[4699]: I0226 12:17:00.335617 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4"} err="failed to get container status \"e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4\": rpc error: code = NotFound desc = could not find container \"e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4\": container with ID starting with e735c3339391fd0f13911f50b898e1f2278f444401de37ed0fd6df5e46d1bba4 not found: ID does not exist" Feb 26 12:17:02 crc kubenswrapper[4699]: I0226 12:17:02.270407 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" path="/var/lib/kubelet/pods/50f381e3-7f33-484e-92d9-fae178f3c093/volumes" Feb 26 12:17:12 crc kubenswrapper[4699]: I0226 12:17:12.261451 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7" Feb 26 12:17:13 crc kubenswrapper[4699]: I0226 12:17:13.370155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5"} Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.317466 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:29 crc 
kubenswrapper[4699]: E0226 12:17:29.318431 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="extract-utilities" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.318446 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="extract-utilities" Feb 26 12:17:29 crc kubenswrapper[4699]: E0226 12:17:29.318455 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="extract-content" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.318460 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="extract-content" Feb 26 12:17:29 crc kubenswrapper[4699]: E0226 12:17:29.318482 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="registry-server" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.318488 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="registry-server" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.318661 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f381e3-7f33-484e-92d9-fae178f3c093" containerName="registry-server" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.320201 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.352834 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.451824 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.452060 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljf8g\" (UniqueName: \"kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.452235 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.554072 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.554183 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ljf8g\" (UniqueName: \"kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.554249 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.554615 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.554798 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.574761 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljf8g\" (UniqueName: \"kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g\") pod \"redhat-marketplace-97nmz\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:29 crc kubenswrapper[4699]: I0226 12:17:29.647451 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:30 crc kubenswrapper[4699]: I0226 12:17:30.172077 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:30 crc kubenswrapper[4699]: I0226 12:17:30.529274 4699 generic.go:334] "Generic (PLEG): container finished" podID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerID="04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f" exitCode=0 Feb 26 12:17:30 crc kubenswrapper[4699]: I0226 12:17:30.529333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerDied","Data":"04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f"} Feb 26 12:17:30 crc kubenswrapper[4699]: I0226 12:17:30.529555 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerStarted","Data":"9636133c3447885de744ca24637f9c92a5da0be59cc4b57a4b972a5ae93e8659"} Feb 26 12:17:31 crc kubenswrapper[4699]: I0226 12:17:31.540652 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerStarted","Data":"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac"} Feb 26 12:17:32 crc kubenswrapper[4699]: I0226 12:17:32.551075 4699 generic.go:334] "Generic (PLEG): container finished" podID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerID="0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac" exitCode=0 Feb 26 12:17:32 crc kubenswrapper[4699]: I0226 12:17:32.551179 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" 
event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerDied","Data":"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac"} Feb 26 12:17:33 crc kubenswrapper[4699]: I0226 12:17:33.562308 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerStarted","Data":"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d"} Feb 26 12:17:33 crc kubenswrapper[4699]: I0226 12:17:33.607141 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-97nmz" podStartSLOduration=2.212645584 podStartE2EDuration="4.607098911s" podCreationTimestamp="2026-02-26 12:17:29 +0000 UTC" firstStartedPulling="2026-02-26 12:17:30.531078007 +0000 UTC m=+3996.341904451" lastFinishedPulling="2026-02-26 12:17:32.925531344 +0000 UTC m=+3998.736357778" observedRunningTime="2026-02-26 12:17:33.585795488 +0000 UTC m=+3999.396621922" watchObservedRunningTime="2026-02-26 12:17:33.607098911 +0000 UTC m=+3999.417925345" Feb 26 12:17:39 crc kubenswrapper[4699]: I0226 12:17:39.660052 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:39 crc kubenswrapper[4699]: I0226 12:17:39.660611 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:39 crc kubenswrapper[4699]: I0226 12:17:39.716760 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:40 crc kubenswrapper[4699]: I0226 12:17:40.301401 4699 scope.go:117] "RemoveContainer" containerID="1830ebb83943317d1452f94ddd1bbd24c88d43dcb4e4541e0c9c10d16e425c29" Feb 26 12:17:40 crc kubenswrapper[4699]: I0226 12:17:40.691852 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:40 crc kubenswrapper[4699]: I0226 12:17:40.743935 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:42 crc kubenswrapper[4699]: I0226 12:17:42.650395 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-97nmz" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="registry-server" containerID="cri-o://35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d" gracePeriod=2 Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.161976 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.344765 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities\") pod \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.344907 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content\") pod \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.345005 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljf8g\" (UniqueName: \"kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g\") pod \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\" (UID: \"5f9a62e3-8b3a-4741-93f3-910d206d1bde\") " Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.345608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities" (OuterVolumeSpecName: "utilities") pod "5f9a62e3-8b3a-4741-93f3-910d206d1bde" (UID: "5f9a62e3-8b3a-4741-93f3-910d206d1bde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.381293 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f9a62e3-8b3a-4741-93f3-910d206d1bde" (UID: "5f9a62e3-8b3a-4741-93f3-910d206d1bde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.447326 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.447383 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9a62e3-8b3a-4741-93f3-910d206d1bde-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.626294 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g" (OuterVolumeSpecName: "kube-api-access-ljf8g") pod "5f9a62e3-8b3a-4741-93f3-910d206d1bde" (UID: "5f9a62e3-8b3a-4741-93f3-910d206d1bde"). InnerVolumeSpecName "kube-api-access-ljf8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.650768 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljf8g\" (UniqueName: \"kubernetes.io/projected/5f9a62e3-8b3a-4741-93f3-910d206d1bde-kube-api-access-ljf8g\") on node \"crc\" DevicePath \"\"" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.665267 4699 generic.go:334] "Generic (PLEG): container finished" podID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerID="35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d" exitCode=0 Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.665329 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerDied","Data":"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d"} Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.665365 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nmz" event={"ID":"5f9a62e3-8b3a-4741-93f3-910d206d1bde","Type":"ContainerDied","Data":"9636133c3447885de744ca24637f9c92a5da0be59cc4b57a4b972a5ae93e8659"} Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.665388 4699 scope.go:117] "RemoveContainer" containerID="35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.665574 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nmz" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.692314 4699 scope.go:117] "RemoveContainer" containerID="0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.714360 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.722912 4699 scope.go:117] "RemoveContainer" containerID="04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.726990 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nmz"] Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.768080 4699 scope.go:117] "RemoveContainer" containerID="35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d" Feb 26 12:17:43 crc kubenswrapper[4699]: E0226 12:17:43.768669 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d\": container with ID starting with 35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d not found: ID does not exist" containerID="35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.768757 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d"} err="failed to get container status \"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d\": rpc error: code = NotFound desc = could not find container \"35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d\": container with ID starting with 35f7239c988fe25e7011c5f22473f2854bb1a482de1b024e4b07254c4b2ccb5d not found: 
ID does not exist" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.768807 4699 scope.go:117] "RemoveContainer" containerID="0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac" Feb 26 12:17:43 crc kubenswrapper[4699]: E0226 12:17:43.769261 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac\": container with ID starting with 0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac not found: ID does not exist" containerID="0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.769292 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac"} err="failed to get container status \"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac\": rpc error: code = NotFound desc = could not find container \"0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac\": container with ID starting with 0143e2f9109b3c28c30aa19cdf7638765b67fdbdd672ecbff1c0d603640189ac not found: ID does not exist" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.769308 4699 scope.go:117] "RemoveContainer" containerID="04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f" Feb 26 12:17:43 crc kubenswrapper[4699]: E0226 12:17:43.769567 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f\": container with ID starting with 04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f not found: ID does not exist" containerID="04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f" Feb 26 12:17:43 crc kubenswrapper[4699]: I0226 12:17:43.769588 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f"} err="failed to get container status \"04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f\": rpc error: code = NotFound desc = could not find container \"04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f\": container with ID starting with 04bee53dc64cf311afb693c3d6c9eb4ce7e191e7b22c1fba0792d43c3e39874f not found: ID does not exist" Feb 26 12:17:44 crc kubenswrapper[4699]: I0226 12:17:44.282576 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" path="/var/lib/kubelet/pods/5f9a62e3-8b3a-4741-93f3-910d206d1bde/volumes" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.148410 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535138-ghxdv"] Feb 26 12:18:00 crc kubenswrapper[4699]: E0226 12:18:00.149384 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="extract-utilities" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.149398 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="extract-utilities" Feb 26 12:18:00 crc kubenswrapper[4699]: E0226 12:18:00.149411 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="registry-server" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.149420 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="registry-server" Feb 26 12:18:00 crc kubenswrapper[4699]: E0226 12:18:00.149429 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="extract-content" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.149434 4699 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="extract-content" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.149676 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9a62e3-8b3a-4741-93f3-910d206d1bde" containerName="registry-server" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.151955 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.154393 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.154687 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.156906 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.161998 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535138-ghxdv"] Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.302041 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bd8m\" (UniqueName: \"kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m\") pod \"auto-csr-approver-29535138-ghxdv\" (UID: \"3d73a20e-eea0-421b-8efd-6fd86f1e4d98\") " pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.403847 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bd8m\" (UniqueName: \"kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m\") pod \"auto-csr-approver-29535138-ghxdv\" (UID: 
\"3d73a20e-eea0-421b-8efd-6fd86f1e4d98\") " pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.420860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bd8m\" (UniqueName: \"kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m\") pod \"auto-csr-approver-29535138-ghxdv\" (UID: \"3d73a20e-eea0-421b-8efd-6fd86f1e4d98\") " pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.482473 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:00 crc kubenswrapper[4699]: I0226 12:18:00.949219 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535138-ghxdv"] Feb 26 12:18:00 crc kubenswrapper[4699]: W0226 12:18:00.953924 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d73a20e_eea0_421b_8efd_6fd86f1e4d98.slice/crio-cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b WatchSource:0}: Error finding container cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b: Status 404 returned error can't find the container with id cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b Feb 26 12:18:01 crc kubenswrapper[4699]: I0226 12:18:01.878925 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" event={"ID":"3d73a20e-eea0-421b-8efd-6fd86f1e4d98","Type":"ContainerStarted","Data":"cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b"} Feb 26 12:18:02 crc kubenswrapper[4699]: I0226 12:18:02.891422 4699 generic.go:334] "Generic (PLEG): container finished" podID="3d73a20e-eea0-421b-8efd-6fd86f1e4d98" containerID="206617da387e97d81b9b831e8d26536a56cede7f0a2daac8fe00d38d64e627ce" exitCode=0 
Feb 26 12:18:02 crc kubenswrapper[4699]: I0226 12:18:02.891531 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" event={"ID":"3d73a20e-eea0-421b-8efd-6fd86f1e4d98","Type":"ContainerDied","Data":"206617da387e97d81b9b831e8d26536a56cede7f0a2daac8fe00d38d64e627ce"} Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.319006 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.483380 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bd8m\" (UniqueName: \"kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m\") pod \"3d73a20e-eea0-421b-8efd-6fd86f1e4d98\" (UID: \"3d73a20e-eea0-421b-8efd-6fd86f1e4d98\") " Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.488855 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m" (OuterVolumeSpecName: "kube-api-access-9bd8m") pod "3d73a20e-eea0-421b-8efd-6fd86f1e4d98" (UID: "3d73a20e-eea0-421b-8efd-6fd86f1e4d98"). InnerVolumeSpecName "kube-api-access-9bd8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.586032 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bd8m\" (UniqueName: \"kubernetes.io/projected/3d73a20e-eea0-421b-8efd-6fd86f1e4d98-kube-api-access-9bd8m\") on node \"crc\" DevicePath \"\"" Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.911596 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" event={"ID":"3d73a20e-eea0-421b-8efd-6fd86f1e4d98","Type":"ContainerDied","Data":"cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b"} Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.912038 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc10b7fb9885394124fc9da8f75eb4cdb80b5176fd79644199ce9dd857327c0b" Feb 26 12:18:04 crc kubenswrapper[4699]: I0226 12:18:04.911859 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535138-ghxdv" Feb 26 12:18:05 crc kubenswrapper[4699]: I0226 12:18:05.380835 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535132-hr4rf"] Feb 26 12:18:05 crc kubenswrapper[4699]: I0226 12:18:05.403643 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535132-hr4rf"] Feb 26 12:18:06 crc kubenswrapper[4699]: I0226 12:18:06.273316 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff3c3e0-4f58-401c-9f5f-b733727f73ff" path="/var/lib/kubelet/pods/5ff3c3e0-4f58-401c-9f5f-b733727f73ff/volumes" Feb 26 12:18:40 crc kubenswrapper[4699]: I0226 12:18:40.364307 4699 scope.go:117] "RemoveContainer" containerID="747ddaa984d13eaf0f8ee9e7ae1b9299bffa91ea051e4eb23c1b1a2ab2aaf402" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.444791 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-l2l5g/must-gather-zwd9v"] Feb 26 12:18:48 crc kubenswrapper[4699]: E0226 12:18:48.445686 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d73a20e-eea0-421b-8efd-6fd86f1e4d98" containerName="oc" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.445701 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d73a20e-eea0-421b-8efd-6fd86f1e4d98" containerName="oc" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.445949 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d73a20e-eea0-421b-8efd-6fd86f1e4d98" containerName="oc" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.447174 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.453843 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-l2l5g"/"default-dockercfg-hhg4j" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.453939 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l2l5g"/"kube-root-ca.crt" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.454034 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l2l5g"/"openshift-service-ca.crt" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.462359 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l2l5g/must-gather-zwd9v"] Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.609183 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmbm\" (UniqueName: \"kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: 
I0226 12:18:48.609239 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.712546 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmbm\" (UniqueName: \"kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.712599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.713616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 12:18:48.739099 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmbm\" (UniqueName: \"kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm\") pod \"must-gather-zwd9v\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:48 crc kubenswrapper[4699]: I0226 
12:18:48.770014 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:18:49 crc kubenswrapper[4699]: I0226 12:18:49.266504 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l2l5g/must-gather-zwd9v"] Feb 26 12:18:49 crc kubenswrapper[4699]: W0226 12:18:49.271990 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a2e674_d3fd_4fac_b5e0_b201dd644f25.slice/crio-d3f9348174a99f466fb6975b66d2932e06844b8873e974def4face740c871121 WatchSource:0}: Error finding container d3f9348174a99f466fb6975b66d2932e06844b8873e974def4face740c871121: Status 404 returned error can't find the container with id d3f9348174a99f466fb6975b66d2932e06844b8873e974def4face740c871121 Feb 26 12:18:49 crc kubenswrapper[4699]: I0226 12:18:49.350399 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" event={"ID":"e1a2e674-d3fd-4fac-b5e0-b201dd644f25","Type":"ContainerStarted","Data":"d3f9348174a99f466fb6975b66d2932e06844b8873e974def4face740c871121"} Feb 26 12:18:50 crc kubenswrapper[4699]: I0226 12:18:50.368568 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" event={"ID":"e1a2e674-d3fd-4fac-b5e0-b201dd644f25","Type":"ContainerStarted","Data":"69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e"} Feb 26 12:18:50 crc kubenswrapper[4699]: I0226 12:18:50.368625 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" event={"ID":"e1a2e674-d3fd-4fac-b5e0-b201dd644f25","Type":"ContainerStarted","Data":"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0"} Feb 26 12:18:50 crc kubenswrapper[4699]: I0226 12:18:50.400539 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-l2l5g/must-gather-zwd9v" podStartSLOduration=2.400512573 podStartE2EDuration="2.400512573s" podCreationTimestamp="2026-02-26 12:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 12:18:50.386597419 +0000 UTC m=+4076.197423863" watchObservedRunningTime="2026-02-26 12:18:50.400512573 +0000 UTC m=+4076.211339017" Feb 26 12:18:51 crc kubenswrapper[4699]: E0226 12:18:51.620157 4699 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:38684->38.102.83.213:34509: write tcp 38.102.83.213:38684->38.102.83.213:34509: write: connection reset by peer Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.525444 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-j6ff4"] Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.529153 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.607544 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmtmt\" (UniqueName: \"kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.607586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.709072 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zmtmt\" (UniqueName: \"kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.709126 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.709448 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.733043 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmtmt\" (UniqueName: \"kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt\") pod \"crc-debug-j6ff4\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: I0226 12:18:53.849557 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:18:53 crc kubenswrapper[4699]: W0226 12:18:53.882647 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01c07c4e_7806_421d_abac_3a7288adae16.slice/crio-048b22d49df716dfdf18bf0b5a37794f7b9ab766a5ef5901add33896fa293116 WatchSource:0}: Error finding container 048b22d49df716dfdf18bf0b5a37794f7b9ab766a5ef5901add33896fa293116: Status 404 returned error can't find the container with id 048b22d49df716dfdf18bf0b5a37794f7b9ab766a5ef5901add33896fa293116 Feb 26 12:18:54 crc kubenswrapper[4699]: I0226 12:18:54.408037 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" event={"ID":"01c07c4e-7806-421d-abac-3a7288adae16","Type":"ContainerStarted","Data":"914b1b71a76d7ca6020b99c9b97dbd825c08917a7129daad95e968dab1ca96e3"} Feb 26 12:18:54 crc kubenswrapper[4699]: I0226 12:18:54.408374 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" event={"ID":"01c07c4e-7806-421d-abac-3a7288adae16","Type":"ContainerStarted","Data":"048b22d49df716dfdf18bf0b5a37794f7b9ab766a5ef5901add33896fa293116"} Feb 26 12:18:54 crc kubenswrapper[4699]: I0226 12:18:54.432718 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" podStartSLOduration=1.432694492 podStartE2EDuration="1.432694492s" podCreationTimestamp="2026-02-26 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 12:18:54.421710691 +0000 UTC m=+4080.232537135" watchObservedRunningTime="2026-02-26 12:18:54.432694492 +0000 UTC m=+4080.243520946" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.702930 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] 
Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.705979 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.720930 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.885188 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.885746 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.886068 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbd45\" (UniqueName: \"kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.987389 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbd45\" (UniqueName: \"kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 
12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.987746 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.987873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.988324 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:25 crc kubenswrapper[4699]: I0226 12:19:25.988402 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:26 crc kubenswrapper[4699]: I0226 12:19:26.020814 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbd45\" (UniqueName: \"kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45\") pod \"redhat-operators-wrd9m\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:26 crc kubenswrapper[4699]: I0226 12:19:26.038254 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:26 crc kubenswrapper[4699]: I0226 12:19:26.530010 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] Feb 26 12:19:26 crc kubenswrapper[4699]: I0226 12:19:26.699492 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerStarted","Data":"d385ffaad2418df1f042a6db3f0181c3457f9bd79043978aea4ed59564bf6651"} Feb 26 12:19:27 crc kubenswrapper[4699]: I0226 12:19:27.738059 4699 generic.go:334] "Generic (PLEG): container finished" podID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerID="2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f" exitCode=0 Feb 26 12:19:27 crc kubenswrapper[4699]: I0226 12:19:27.738130 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerDied","Data":"2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f"} Feb 26 12:19:27 crc kubenswrapper[4699]: I0226 12:19:27.742020 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 12:19:29 crc kubenswrapper[4699]: I0226 12:19:29.759066 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerStarted","Data":"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e"} Feb 26 12:19:30 crc kubenswrapper[4699]: I0226 12:19:30.770532 4699 generic.go:334] "Generic (PLEG): container finished" podID="01c07c4e-7806-421d-abac-3a7288adae16" containerID="914b1b71a76d7ca6020b99c9b97dbd825c08917a7129daad95e968dab1ca96e3" exitCode=0 Feb 26 12:19:30 crc kubenswrapper[4699]: I0226 12:19:30.770610 4699 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" event={"ID":"01c07c4e-7806-421d-abac-3a7288adae16","Type":"ContainerDied","Data":"914b1b71a76d7ca6020b99c9b97dbd825c08917a7129daad95e968dab1ca96e3"} Feb 26 12:19:31 crc kubenswrapper[4699]: I0226 12:19:31.976335 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.013957 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-j6ff4"] Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.021754 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-j6ff4"] Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.064682 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmtmt\" (UniqueName: \"kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt\") pod \"01c07c4e-7806-421d-abac-3a7288adae16\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.064852 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host\") pod \"01c07c4e-7806-421d-abac-3a7288adae16\" (UID: \"01c07c4e-7806-421d-abac-3a7288adae16\") " Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.065048 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host" (OuterVolumeSpecName: "host") pod "01c07c4e-7806-421d-abac-3a7288adae16" (UID: "01c07c4e-7806-421d-abac-3a7288adae16"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.065514 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01c07c4e-7806-421d-abac-3a7288adae16-host\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.073076 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt" (OuterVolumeSpecName: "kube-api-access-zmtmt") pod "01c07c4e-7806-421d-abac-3a7288adae16" (UID: "01c07c4e-7806-421d-abac-3a7288adae16"). InnerVolumeSpecName "kube-api-access-zmtmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.167481 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmtmt\" (UniqueName: \"kubernetes.io/projected/01c07c4e-7806-421d-abac-3a7288adae16-kube-api-access-zmtmt\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.272721 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c07c4e-7806-421d-abac-3a7288adae16" path="/var/lib/kubelet/pods/01c07c4e-7806-421d-abac-3a7288adae16/volumes" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.881571 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-j6ff4" Feb 26 12:19:32 crc kubenswrapper[4699]: I0226 12:19:32.881622 4699 scope.go:117] "RemoveContainer" containerID="914b1b71a76d7ca6020b99c9b97dbd825c08917a7129daad95e968dab1ca96e3" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.207909 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-dmh22"] Feb 26 12:19:33 crc kubenswrapper[4699]: E0226 12:19:33.208414 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c07c4e-7806-421d-abac-3a7288adae16" containerName="container-00" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.208432 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c07c4e-7806-421d-abac-3a7288adae16" containerName="container-00" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.208743 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c07c4e-7806-421d-abac-3a7288adae16" containerName="container-00" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.209544 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.400621 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.400949 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6k2d\" (UniqueName: \"kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.504098 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6k2d\" (UniqueName: \"kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.504199 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.504346 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc 
kubenswrapper[4699]: I0226 12:19:33.526542 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6k2d\" (UniqueName: \"kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d\") pod \"crc-debug-dmh22\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.529605 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:33 crc kubenswrapper[4699]: I0226 12:19:33.892910 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" event={"ID":"a70563cf-4017-4654-a730-3bd13e1b3b3a","Type":"ContainerStarted","Data":"cf08047c111eeed20efe0e8113b2e168d2dda4bf486d223eef2adf35e628e7e3"} Feb 26 12:19:34 crc kubenswrapper[4699]: I0226 12:19:34.950189 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" event={"ID":"a70563cf-4017-4654-a730-3bd13e1b3b3a","Type":"ContainerStarted","Data":"3661f7766c195df1890f39782c6ee0afb458e5fb745113c3cb232308b3d30727"} Feb 26 12:19:34 crc kubenswrapper[4699]: I0226 12:19:34.971711 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" podStartSLOduration=1.971692563 podStartE2EDuration="1.971692563s" podCreationTimestamp="2026-02-26 12:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 12:19:34.965428836 +0000 UTC m=+4120.776255270" watchObservedRunningTime="2026-02-26 12:19:34.971692563 +0000 UTC m=+4120.782518987" Feb 26 12:19:38 crc kubenswrapper[4699]: I0226 12:19:38.141280 4699 generic.go:334] "Generic (PLEG): container finished" podID="a70563cf-4017-4654-a730-3bd13e1b3b3a" 
containerID="3661f7766c195df1890f39782c6ee0afb458e5fb745113c3cb232308b3d30727" exitCode=0 Feb 26 12:19:38 crc kubenswrapper[4699]: I0226 12:19:38.141404 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" event={"ID":"a70563cf-4017-4654-a730-3bd13e1b3b3a","Type":"ContainerDied","Data":"3661f7766c195df1890f39782c6ee0afb458e5fb745113c3cb232308b3d30727"} Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.261699 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.290958 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-dmh22"] Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.298920 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-dmh22"] Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.446268 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6k2d\" (UniqueName: \"kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d\") pod \"a70563cf-4017-4654-a730-3bd13e1b3b3a\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.446470 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host\") pod \"a70563cf-4017-4654-a730-3bd13e1b3b3a\" (UID: \"a70563cf-4017-4654-a730-3bd13e1b3b3a\") " Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.446752 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host" (OuterVolumeSpecName: "host") pod "a70563cf-4017-4654-a730-3bd13e1b3b3a" (UID: "a70563cf-4017-4654-a730-3bd13e1b3b3a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.447133 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70563cf-4017-4654-a730-3bd13e1b3b3a-host\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.452165 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d" (OuterVolumeSpecName: "kube-api-access-p6k2d") pod "a70563cf-4017-4654-a730-3bd13e1b3b3a" (UID: "a70563cf-4017-4654-a730-3bd13e1b3b3a"). InnerVolumeSpecName "kube-api-access-p6k2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:19:39 crc kubenswrapper[4699]: I0226 12:19:39.550155 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6k2d\" (UniqueName: \"kubernetes.io/projected/a70563cf-4017-4654-a730-3bd13e1b3b3a-kube-api-access-p6k2d\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.373037 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-dmh22" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.381538 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70563cf-4017-4654-a730-3bd13e1b3b3a" path="/var/lib/kubelet/pods/a70563cf-4017-4654-a730-3bd13e1b3b3a/volumes" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.382261 4699 scope.go:117] "RemoveContainer" containerID="3661f7766c195df1890f39782c6ee0afb458e5fb745113c3cb232308b3d30727" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.551587 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-ft29p"] Feb 26 12:19:40 crc kubenswrapper[4699]: E0226 12:19:40.552229 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70563cf-4017-4654-a730-3bd13e1b3b3a" containerName="container-00" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.552250 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70563cf-4017-4654-a730-3bd13e1b3b3a" containerName="container-00" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.552493 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70563cf-4017-4654-a730-3bd13e1b3b3a" containerName="container-00" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.553379 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.565749 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.565790 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvns4\" (UniqueName: \"kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.667467 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.667512 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvns4\" (UniqueName: \"kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.667565 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc 
kubenswrapper[4699]: I0226 12:19:40.685710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvns4\" (UniqueName: \"kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4\") pod \"crc-debug-ft29p\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:40 crc kubenswrapper[4699]: I0226 12:19:40.874928 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.397197 4699 generic.go:334] "Generic (PLEG): container finished" podID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerID="7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e" exitCode=0 Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.397274 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerDied","Data":"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e"} Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.399480 4699 generic.go:334] "Generic (PLEG): container finished" podID="858ae445-a203-46c0-b9f1-4dcf82a7b902" containerID="cbef9b8d06df85870411c16f203fb0797263277126ea2f97232cdb89a5553998" exitCode=0 Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.399528 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" event={"ID":"858ae445-a203-46c0-b9f1-4dcf82a7b902","Type":"ContainerDied","Data":"cbef9b8d06df85870411c16f203fb0797263277126ea2f97232cdb89a5553998"} Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.399557 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" 
event={"ID":"858ae445-a203-46c0-b9f1-4dcf82a7b902","Type":"ContainerStarted","Data":"fe8482311d347e90bb4cda3d78e7fb0585efc4b96b416a5a37d37b43b3af663f"} Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.472905 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-ft29p"] Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.483040 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l2l5g/crc-debug-ft29p"] Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.584665 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:19:41 crc kubenswrapper[4699]: I0226 12:19:41.584729 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.409727 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerStarted","Data":"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7"} Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.442480 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wrd9m" podStartSLOduration=3.360578438 podStartE2EDuration="17.442454066s" podCreationTimestamp="2026-02-26 12:19:25 +0000 UTC" firstStartedPulling="2026-02-26 12:19:27.74153858 +0000 UTC m=+4113.552365014" lastFinishedPulling="2026-02-26 12:19:41.823414208 
+0000 UTC m=+4127.634240642" observedRunningTime="2026-02-26 12:19:42.430671812 +0000 UTC m=+4128.241498266" watchObservedRunningTime="2026-02-26 12:19:42.442454066 +0000 UTC m=+4128.253280510" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.662157 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.740773 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvns4\" (UniqueName: \"kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4\") pod \"858ae445-a203-46c0-b9f1-4dcf82a7b902\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.740905 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host\") pod \"858ae445-a203-46c0-b9f1-4dcf82a7b902\" (UID: \"858ae445-a203-46c0-b9f1-4dcf82a7b902\") " Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.741139 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host" (OuterVolumeSpecName: "host") pod "858ae445-a203-46c0-b9f1-4dcf82a7b902" (UID: "858ae445-a203-46c0-b9f1-4dcf82a7b902"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.741782 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/858ae445-a203-46c0-b9f1-4dcf82a7b902-host\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.750582 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4" (OuterVolumeSpecName: "kube-api-access-cvns4") pod "858ae445-a203-46c0-b9f1-4dcf82a7b902" (UID: "858ae445-a203-46c0-b9f1-4dcf82a7b902"). InnerVolumeSpecName "kube-api-access-cvns4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:19:42 crc kubenswrapper[4699]: I0226 12:19:42.843488 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvns4\" (UniqueName: \"kubernetes.io/projected/858ae445-a203-46c0-b9f1-4dcf82a7b902-kube-api-access-cvns4\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:43 crc kubenswrapper[4699]: I0226 12:19:43.429094 4699 scope.go:117] "RemoveContainer" containerID="cbef9b8d06df85870411c16f203fb0797263277126ea2f97232cdb89a5553998" Feb 26 12:19:43 crc kubenswrapper[4699]: I0226 12:19:43.429305 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/crc-debug-ft29p" Feb 26 12:19:44 crc kubenswrapper[4699]: I0226 12:19:44.277823 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858ae445-a203-46c0-b9f1-4dcf82a7b902" path="/var/lib/kubelet/pods/858ae445-a203-46c0-b9f1-4dcf82a7b902/volumes" Feb 26 12:19:46 crc kubenswrapper[4699]: I0226 12:19:46.039251 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:46 crc kubenswrapper[4699]: I0226 12:19:46.039598 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:47 crc kubenswrapper[4699]: I0226 12:19:47.271959 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wrd9m" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="registry-server" probeResult="failure" output=< Feb 26 12:19:47 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Feb 26 12:19:47 crc kubenswrapper[4699]: > Feb 26 12:19:56 crc kubenswrapper[4699]: I0226 12:19:56.104184 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:56 crc kubenswrapper[4699]: I0226 12:19:56.239706 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:56 crc kubenswrapper[4699]: I0226 12:19:56.903815 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] Feb 26 12:19:57 crc kubenswrapper[4699]: I0226 12:19:57.930931 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wrd9m" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="registry-server" 
containerID="cri-o://ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7" gracePeriod=2 Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.417657 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.619680 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbd45\" (UniqueName: \"kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45\") pod \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.620044 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities\") pod \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.620243 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content\") pod \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\" (UID: \"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf\") " Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.620840 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities" (OuterVolumeSpecName: "utilities") pod "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" (UID: "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.621095 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.628313 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45" (OuterVolumeSpecName: "kube-api-access-cbd45") pod "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" (UID: "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf"). InnerVolumeSpecName "kube-api-access-cbd45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.722945 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbd45\" (UniqueName: \"kubernetes.io/projected/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-kube-api-access-cbd45\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.745723 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" (UID: "b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.823891 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.943234 4699 generic.go:334] "Generic (PLEG): container finished" podID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerID="ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7" exitCode=0 Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.943280 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerDied","Data":"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7"} Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.943315 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrd9m" event={"ID":"b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf","Type":"ContainerDied","Data":"d385ffaad2418df1f042a6db3f0181c3457f9bd79043978aea4ed59564bf6651"} Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.943375 4699 scope.go:117] "RemoveContainer" containerID="ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.944388 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrd9m" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.975360 4699 scope.go:117] "RemoveContainer" containerID="7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e" Feb 26 12:19:58 crc kubenswrapper[4699]: I0226 12:19:58.994717 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.003071 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wrd9m"] Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.006230 4699 scope.go:117] "RemoveContainer" containerID="2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.258350 4699 scope.go:117] "RemoveContainer" containerID="ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7" Feb 26 12:19:59 crc kubenswrapper[4699]: E0226 12:19:59.258786 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7\": container with ID starting with ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7 not found: ID does not exist" containerID="ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.258824 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7"} err="failed to get container status \"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7\": rpc error: code = NotFound desc = could not find container \"ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7\": container with ID starting with ed1cca4cfd46370f9a4e1dd43a9fe5c873f3c94e168cab7d19373995433384a7 not found: ID does 
not exist" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.258845 4699 scope.go:117] "RemoveContainer" containerID="7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e" Feb 26 12:19:59 crc kubenswrapper[4699]: E0226 12:19:59.259080 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e\": container with ID starting with 7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e not found: ID does not exist" containerID="7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.259161 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e"} err="failed to get container status \"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e\": rpc error: code = NotFound desc = could not find container \"7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e\": container with ID starting with 7de7cd2371d3abbfbf7615ebcd3acb90e41a2f685f589eb3f400c395ba8b7d8e not found: ID does not exist" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.259174 4699 scope.go:117] "RemoveContainer" containerID="2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f" Feb 26 12:19:59 crc kubenswrapper[4699]: E0226 12:19:59.260266 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f\": container with ID starting with 2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f not found: ID does not exist" containerID="2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f" Feb 26 12:19:59 crc kubenswrapper[4699]: I0226 12:19:59.260306 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f"} err="failed to get container status \"2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f\": rpc error: code = NotFound desc = could not find container \"2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f\": container with ID starting with 2daebaa5370afaad83a99ae877015f25f2523dc3704a3d0897f2dfb9d894739f not found: ID does not exist" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.154455 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535140-wg97p"] Feb 26 12:20:00 crc kubenswrapper[4699]: E0226 12:20:00.155000 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="extract-utilities" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155028 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="extract-utilities" Feb 26 12:20:00 crc kubenswrapper[4699]: E0226 12:20:00.155059 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ae445-a203-46c0-b9f1-4dcf82a7b902" containerName="container-00" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155068 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ae445-a203-46c0-b9f1-4dcf82a7b902" containerName="container-00" Feb 26 12:20:00 crc kubenswrapper[4699]: E0226 12:20:00.155083 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="registry-server" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155090 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="registry-server" Feb 26 12:20:00 crc kubenswrapper[4699]: E0226 12:20:00.155132 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="extract-content" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155139 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="extract-content" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155450 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" containerName="registry-server" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.155473 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="858ae445-a203-46c0-b9f1-4dcf82a7b902" containerName="container-00" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.156404 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.161323 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.161617 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.161783 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.175763 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535140-wg97p"] Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.271682 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf" path="/var/lib/kubelet/pods/b51326a2-4ccd-48ce-8c3b-e6fb9ffc56bf/volumes" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.339174 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7hrsf\" (UniqueName: \"kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf\") pod \"auto-csr-approver-29535140-wg97p\" (UID: \"924cba42-fd14-4d50-815d-0d8fa83c6b06\") " pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.441444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hrsf\" (UniqueName: \"kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf\") pod \"auto-csr-approver-29535140-wg97p\" (UID: \"924cba42-fd14-4d50-815d-0d8fa83c6b06\") " pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.479699 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hrsf\" (UniqueName: \"kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf\") pod \"auto-csr-approver-29535140-wg97p\" (UID: \"924cba42-fd14-4d50-815d-0d8fa83c6b06\") " pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:00 crc kubenswrapper[4699]: I0226 12:20:00.776379 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:01 crc kubenswrapper[4699]: I0226 12:20:01.608572 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535140-wg97p"] Feb 26 12:20:02 crc kubenswrapper[4699]: I0226 12:20:02.143752 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535140-wg97p" event={"ID":"924cba42-fd14-4d50-815d-0d8fa83c6b06","Type":"ContainerStarted","Data":"86525d46e644f97de7bd2d890833add4ab8654ccebe3f730334374a10853b020"} Feb 26 12:20:06 crc kubenswrapper[4699]: I0226 12:20:06.179874 4699 generic.go:334] "Generic (PLEG): container finished" podID="924cba42-fd14-4d50-815d-0d8fa83c6b06" containerID="74ea3c51dc439314ff3bb87ede5fd5f905e28e2682d357fd7d7822dde4facddf" exitCode=0 Feb 26 12:20:06 crc kubenswrapper[4699]: I0226 12:20:06.180082 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535140-wg97p" event={"ID":"924cba42-fd14-4d50-815d-0d8fa83c6b06","Type":"ContainerDied","Data":"74ea3c51dc439314ff3bb87ede5fd5f905e28e2682d357fd7d7822dde4facddf"} Feb 26 12:20:07 crc kubenswrapper[4699]: I0226 12:20:07.864588 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.007405 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hrsf\" (UniqueName: \"kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf\") pod \"924cba42-fd14-4d50-815d-0d8fa83c6b06\" (UID: \"924cba42-fd14-4d50-815d-0d8fa83c6b06\") " Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.013771 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf" (OuterVolumeSpecName: "kube-api-access-7hrsf") pod "924cba42-fd14-4d50-815d-0d8fa83c6b06" (UID: "924cba42-fd14-4d50-815d-0d8fa83c6b06"). InnerVolumeSpecName "kube-api-access-7hrsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.110010 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hrsf\" (UniqueName: \"kubernetes.io/projected/924cba42-fd14-4d50-815d-0d8fa83c6b06-kube-api-access-7hrsf\") on node \"crc\" DevicePath \"\"" Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.201054 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535140-wg97p" event={"ID":"924cba42-fd14-4d50-815d-0d8fa83c6b06","Type":"ContainerDied","Data":"86525d46e644f97de7bd2d890833add4ab8654ccebe3f730334374a10853b020"} Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.201106 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86525d46e644f97de7bd2d890833add4ab8654ccebe3f730334374a10853b020" Feb 26 12:20:08 crc kubenswrapper[4699]: I0226 12:20:08.201125 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535140-wg97p" Feb 26 12:20:09 crc kubenswrapper[4699]: I0226 12:20:09.259469 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535134-bh8fn"] Feb 26 12:20:09 crc kubenswrapper[4699]: I0226 12:20:09.268722 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535134-bh8fn"] Feb 26 12:20:10 crc kubenswrapper[4699]: I0226 12:20:10.271986 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916cd984-33ed-4299-ade5-5064478d656f" path="/var/lib/kubelet/pods/916cd984-33ed-4299-ade5-5064478d656f/volumes" Feb 26 12:20:11 crc kubenswrapper[4699]: I0226 12:20:11.585376 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:20:11 crc kubenswrapper[4699]: I0226 12:20:11.585479 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:20:11 crc kubenswrapper[4699]: I0226 12:20:11.844609 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-977f89944-b96zk_dd004e01-9dac-4316-b6ee-05c1a0f20713/barbican-api/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.223986 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-977f89944-b96zk_dd004e01-9dac-4316-b6ee-05c1a0f20713/barbican-api-log/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.241099 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-5bb8c656f4-cl8tt_770f4ffe-352c-416b-8f67-a894c4107003/barbican-keystone-listener/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.340594 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bb8c656f4-cl8tt_770f4ffe-352c-416b-8f67-a894c4107003/barbican-keystone-listener-log/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.492397 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6596b66679-qmv4f_edb59470-4038-48c2-a3ec-f3046406a971/barbican-worker-log/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.517539 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6596b66679-qmv4f_edb59470-4038-48c2-a3ec-f3046406a971/barbican-worker/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.728726 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pl6wj_fee4a36b-0896-43c1-9b23-3da3ae870cbe/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:20:12 crc kubenswrapper[4699]: I0226 12:20:12.789764 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/ceilometer-central-agent/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.348088 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/ceilometer-notification-agent/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.442464 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/proxy-httpd/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.554871 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_09a6eb79-27c3-465b-adae-b32d96c56b65/sg-core/0.log" Feb 
26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.603534 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2c2d2c1-e68e-4b14-a732-3b42a6132503/cinder-api/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.739097 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2c2d2c1-e68e-4b14-a732-3b42a6132503/cinder-api-log/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.819994 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fbf1f488-444f-45d3-b5e6-44506bf45f8e/cinder-scheduler/0.log" Feb 26 12:20:13 crc kubenswrapper[4699]: I0226 12:20:13.862639 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fbf1f488-444f-45d3-b5e6-44506bf45f8e/probe/0.log" Feb 26 12:20:14 crc kubenswrapper[4699]: I0226 12:20:14.643393 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-86gl7_b1a06be0-15ce-4abd-b9e7-7e11e789bd64/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:20:14 crc kubenswrapper[4699]: I0226 12:20:14.659759 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-h9q25_85e0d37e-fb25-4bbc-afe5-7e6ab304390c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 12:20:14 crc kubenswrapper[4699]: I0226 12:20:14.849667 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/init/0.log" Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.109427 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/dnsmasq-dns/0.log" Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.298061 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-hddfn_24dd88a8-4737-4ebc-8925-b2bcedb760c2/init/0.log"
Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.312364 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-f97wz_8d139dcb-cb7b-4711-a8a6-62e27c3cd7e2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.645162 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c58ea0a-4ad4-47cf-8976-a004ef7e56da/glance-httpd/0.log"
Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.715130 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c58ea0a-4ad4-47cf-8976-a004ef7e56da/glance-log/0.log"
Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.844941 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_796738f1-8a6c-4e91-bdfe-bee2f252b3fc/glance-log/0.log"
Feb 26 12:20:15 crc kubenswrapper[4699]: I0226 12:20:15.892575 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_796738f1-8a6c-4e91-bdfe-bee2f252b3fc/glance-httpd/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.486504 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5795557cd8-dvzqq_15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0/horizon/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.649143 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hp4pv_e537c30c-dc6b-406f-bb86-5540ebd8a36d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.679932 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5795557cd8-dvzqq_15a1a0c7-f6c0-46a5-86f9-a0dec5a257e0/horizon-log/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.745399 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mlb2f_ac66647f-74c0-4a4e-9925-e47cd90568a1/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.955389 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535121-plvtd_ef1e8bd7-66e8-4eef-979e-8bf3e57b2a68/keystone-cron/0.log"
Feb 26 12:20:16 crc kubenswrapper[4699]: I0226 12:20:16.993619 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67d4f89fb9-65kmq_5d9e1983-3363-4542-a5f0-deb132ea6994/keystone-api/0.log"
Feb 26 12:20:17 crc kubenswrapper[4699]: I0226 12:20:17.091346 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c685fadd-b283-40bc-9de2-3372317b9875/kube-state-metrics/0.log"
Feb 26 12:20:17 crc kubenswrapper[4699]: I0226 12:20:17.148673 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9wl2f_6436c321-6850-4db3-81b2-0dc329e10900/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:17 crc kubenswrapper[4699]: I0226 12:20:17.574761 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d45896d49-mh862_862cb546-78f8-4864-a158-9dc217ec2796/neutron-httpd/0.log"
Feb 26 12:20:17 crc kubenswrapper[4699]: I0226 12:20:17.658219 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbp7l_59456382-a459-4f82-ac99-b96eb735ddb9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:17 crc kubenswrapper[4699]: I0226 12:20:17.678864 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d45896d49-mh862_862cb546-78f8-4864-a158-9dc217ec2796/neutron-api/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.257440 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2d0d807f-7fdc-4239-b7bb-1952c2f7c222/nova-api-log/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.369109 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2ff15a2d-962f-421b-be00-e3bf6ef22612/nova-cell0-conductor-conductor/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.572039 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2d0d807f-7fdc-4239-b7bb-1952c2f7c222/nova-api-api/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.620852 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ff2b3846-c197-4cc6-a442-0f466d97d53d/nova-cell1-conductor-conductor/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.726590 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8bb28763-ceae-456c-a0d6-5df33b478106/nova-cell1-novncproxy-novncproxy/0.log"
Feb 26 12:20:18 crc kubenswrapper[4699]: I0226 12:20:18.911124 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wv666_2c2e8329-038c-4347-b30f-f8b42f36cc67/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.273017 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15752dfa-4afb-412f-99a0-75c5fe76f6a8/nova-metadata-log/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.532677 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9d8371db-373f-4a41-97cb-b2d00aa17571/nova-scheduler-scheduler/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.533857 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/mysql-bootstrap/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.691527 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/mysql-bootstrap/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.782910 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_edce8e75-6dd5-4fbd-8f76-bc6553cc27b9/galera/0.log"
Feb 26 12:20:19 crc kubenswrapper[4699]: I0226 12:20:19.914815 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/mysql-bootstrap/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.119940 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/mysql-bootstrap/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.197336 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6fdc6b6d-ac77-4179-9864-f220d622c0f4/galera/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.323527 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_16db7cc3-bd7c-44aa-b92f-d2a645d96ef0/openstackclient/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.516076 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qfxsz_a4767003-9eba-4b86-933c-5bcbaa93e458/openstack-network-exporter/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.717974 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15752dfa-4afb-412f-99a0-75c5fe76f6a8/nova-metadata-metadata/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.719141 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nrvng_cd4015f0-f1a7-40d7-ae69-089f74a6873d/ovn-controller/0.log"
Feb 26 12:20:20 crc kubenswrapper[4699]: I0226 12:20:20.804518 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server-init/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.020782 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovs-vswitchd/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.022478 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server-init/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.121185 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gxnxl_8afc038e-11dc-4959-a6b0-61e9b1c2dc35/ovsdb-server/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.248601 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hmpqg_dd7a61f4-47ba-4b2f-9015-c6ab27ff3c6b/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.283571 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbd47d6-02c1-4ac4-a981-231eb0f13530/openstack-network-exporter/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.320870 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbd47d6-02c1-4ac4-a981-231eb0f13530/ovn-northd/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.507818 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ef805480-81ec-4d0b-b2ca-06db4bf74383/openstack-network-exporter/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.538134 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ef805480-81ec-4d0b-b2ca-06db4bf74383/ovsdbserver-nb/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.725459 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b981c8a5-ce76-4bc1-a018-28255391e3f2/openstack-network-exporter/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.733256 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b981c8a5-ce76-4bc1-a018-28255391e3f2/ovsdbserver-sb/0.log"
Feb 26 12:20:21 crc kubenswrapper[4699]: I0226 12:20:21.941405 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d4878dd78-qpvzg_b7700bd0-21d8-4b96-9753-2619443038a3/placement-api/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.038842 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/setup-container/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.094481 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d4878dd78-qpvzg_b7700bd0-21d8-4b96-9753-2619443038a3/placement-log/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.202961 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/setup-container/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.267756 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3b731314-eb90-4a19-a425-2f9282af2a7f/rabbitmq/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.416480 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/setup-container/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.837772 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/setup-container/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.866035 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0d9b2e6e-c43b-49ae-a71e-844610621e3e/rabbitmq/0.log"
Feb 26 12:20:22 crc kubenswrapper[4699]: I0226 12:20:22.904383 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lqq4l_a1aabb80-3c23-4f5a-9bd1-4d573089856c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.084351 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zdf2z_fcea0fcf-0c80-4334-9327-f0a57b385cc9/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.177604 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mqd9n_57bbec48-f33e-43b8-9f82-8cc3a42e7723/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.310666 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8w2tv_96b6beba-4e99-4cb7-b49b-3f211c5e12b7/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.417742 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t4sjg_2930a730-d5e2-49e1-a618-7428b999a73d/ssh-known-hosts-edpm-deployment/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.622056 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78cbc76b59-m6shv_5a4ece68-df2a-480c-9531-1d133d7f4bd0/proxy-server/0.log"
Feb 26 12:20:23 crc kubenswrapper[4699]: I0226 12:20:23.736689 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78cbc76b59-m6shv_5a4ece68-df2a-480c-9531-1d133d7f4bd0/proxy-httpd/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.307390 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lqqdx_9125ee3a-a0b6-469b-b79d-3a376f2d5d91/swift-ring-rebalance/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.317111 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-auditor/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.356798 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-reaper/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.500051 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-server/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.544531 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/account-replicator/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.611524 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-auditor/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.640661 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-replicator/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.715569 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-server/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.770885 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/container-updater/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.815792 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-auditor/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.912878 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-expirer/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.913227 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-replicator/0.log"
Feb 26 12:20:24 crc kubenswrapper[4699]: I0226 12:20:24.967942 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-server/0.log"
Feb 26 12:20:25 crc kubenswrapper[4699]: I0226 12:20:25.060394 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/object-updater/0.log"
Feb 26 12:20:25 crc kubenswrapper[4699]: I0226 12:20:25.130922 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/swift-recon-cron/0.log"
Feb 26 12:20:25 crc kubenswrapper[4699]: I0226 12:20:25.142760 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f23ec57b-7ab1-4152-8108-e0e27b4ba95c/rsync/0.log"
Feb 26 12:20:25 crc kubenswrapper[4699]: I0226 12:20:25.312247 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zkpz9_08bdd16a-fc18-4262-9175-a05b613a76c9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:26 crc kubenswrapper[4699]: I0226 12:20:26.107588 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_66beadbe-fd5d-48af-8a33-8a652c8d1c71/test-operator-logs-container/0.log"
Feb 26 12:20:26 crc kubenswrapper[4699]: I0226 12:20:26.121390 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_19e02200-91be-49f8-8174-4a0bf6cda9dd/tempest-tests-tempest-tests-runner/0.log"
Feb 26 12:20:26 crc kubenswrapper[4699]: I0226 12:20:26.351496 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9npsm_974c869a-b430-4a83-81d0-ece37d67c0b0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 26 12:20:35 crc kubenswrapper[4699]: I0226 12:20:35.335403 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6530fcf8-efdc-4f91-96cb-4f4bdc8bd1d2/memcached/0.log"
Feb 26 12:20:40 crc kubenswrapper[4699]: I0226 12:20:40.494281 4699 scope.go:117] "RemoveContainer" containerID="ae1928085c149280cf3addf69107c792048518ecf95f2de337f2886f53e0e594"
Feb 26 12:20:41 crc kubenswrapper[4699]: I0226 12:20:41.585102 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 12:20:41 crc kubenswrapper[4699]: I0226 12:20:41.585487 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 12:20:41 crc kubenswrapper[4699]: I0226 12:20:41.585537 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79"
Feb 26 12:20:41 crc kubenswrapper[4699]: I0226 12:20:41.586382 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5"} pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 12:20:41 crc kubenswrapper[4699]: I0226 12:20:41.586432 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5" gracePeriod=600
Feb 26 12:20:42 crc kubenswrapper[4699]: I0226 12:20:42.572551 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5" exitCode=0
Feb 26 12:20:42 crc kubenswrapper[4699]: I0226 12:20:42.572642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5"}
Feb 26 12:20:42 crc kubenswrapper[4699]: I0226 12:20:42.573530 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"}
Feb 26 12:20:42 crc kubenswrapper[4699]: I0226 12:20:42.573560 4699 scope.go:117] "RemoveContainer" containerID="994e7fffe13fcdad450e5df047474260de901881a302afaa9d85c3116c5763b7"
Feb 26 12:20:55 crc kubenswrapper[4699]: I0226 12:20:55.252026 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-4k4sm_07c2552c-8182-4cfe-a397-39ad287029e5/manager/0.log"
Feb 26 12:20:55 crc kubenswrapper[4699]: I0226 12:20:55.474267 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log"
Feb 26 12:20:55 crc kubenswrapper[4699]: I0226 12:20:55.665572 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log"
Feb 26 12:20:55 crc kubenswrapper[4699]: I0226 12:20:55.720355 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log"
Feb 26 12:20:56 crc kubenswrapper[4699]: I0226 12:20:56.063865 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log"
Feb 26 12:20:56 crc kubenswrapper[4699]: I0226 12:20:56.251408 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/util/0.log"
Feb 26 12:20:56 crc kubenswrapper[4699]: I0226 12:20:56.255325 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/pull/0.log"
Feb 26 12:20:56 crc kubenswrapper[4699]: I0226 12:20:56.438930 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfb67d78cf82038cea84fa4b5c0f8697740883dc99ef57c523054710b1xdxm8_449351cd-8256-4e21-b27e-be3c4db11ca5/extract/0.log"
Feb 26 12:20:56 crc kubenswrapper[4699]: I0226 12:20:56.776930 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-jh7vz_27e251bb-8f9b-48d4-9ea3-81d03fd85244/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.002556 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-t8c9f_7b204025-d5ff-4c74-96b9-6774b62e0cc4/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.286994 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qf9vd_619dff06-7255-4aab-9ffe-9f2561bcc904/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.434207 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-xw85z_35555f68-d5c4-44b2-9dfa-af5f91f57c7c/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.664053 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-5k85p_d56efcbf-3414-4bd1-9cbf-d56c434ac529/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.878911 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-mtrs6_afbeb2d8-c332-447b-a931-9fe7b246914d/manager/0.log"
Feb 26 12:20:57 crc kubenswrapper[4699]: I0226 12:20:57.974300 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-d2pxc_a2c419ab-2a99-4d37-b46c-b84024f24b2e/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.138442 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-9gwwj_caabfe5b-420c-47c7-9ed2-b4ac9b2d54f2/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.317926 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-95whc_38eef260-c32f-4568-9936-6197ba984f05/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.678012 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-6gblm_54959b79-361c-415a-986d-1af6d8eb6701/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.717854 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4mghs_0ca72d8e-ed5f-47e4-b9c0-62b41fd687ee/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.734251 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-2wj2n_a6e7ca85-e18b-4605-9180-316f65b82006/manager/0.log"
Feb 26 12:20:58 crc kubenswrapper[4699]: I0226 12:20:58.878038 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cktpfb_ce7c40ca-05ad-49ca-a091-02ac588c3eb7/manager/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.217905 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7c5cc54f9c-wjrrd_3a6d1210-ece5-4666-80bf-c7c7821e441c/operator/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.343492 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gmh8j_22cfe789-87ae-4b23-91c2-cbb5112e4285/registry-server/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.472136 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-96png_a90c4025-7bd1-401b-8f92-5f15a58fb3d6/manager/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.735204 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-jxr77_7545763d-d2d2-4b6e-980d-737062f0a894/manager/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.808253 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ghqf4_8d440653-f1c3-483c-a37d-463dcfc15224/operator/0.log"
Feb 26 12:20:59 crc kubenswrapper[4699]: I0226 12:20:59.985700 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-bqvxr_33fc0a61-18c9-4e80-b898-92a5b1b71dac/manager/0.log"
Feb 26 12:21:00 crc kubenswrapper[4699]: I0226 12:21:00.448472 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-mwvnr_5be0c14a-e51f-4b69-ab58-c0cac66910e2/manager/0.log"
Feb 26 12:21:00 crc kubenswrapper[4699]: I0226 12:21:00.483482 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-f9kz5_15255a9b-0767-4518-8e81-ca9044f9190a/manager/0.log"
Feb 26 12:21:00 crc kubenswrapper[4699]: I0226 12:21:00.693107 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-fnnc7_a2b3bf3b-a815-4033-983b-eedc16b8609f/manager/0.log"
Feb 26 12:21:00 crc kubenswrapper[4699]: I0226 12:21:00.882721 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-947f4f86b-m69sv_ebf1a568-be30-4ceb-bc67-e3158a0280b9/manager/0.log"
Feb 26 12:21:05 crc kubenswrapper[4699]: I0226 12:21:05.177778 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-sndb9_1814471e-5f82-4464-9528-75da66d7235b/manager/0.log"
Feb 26 12:21:23 crc kubenswrapper[4699]: I0226 12:21:23.540183 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-p9wj4_bad776f4-e24b-41f1-88d8-2b1fe6258783/control-plane-machine-set-operator/0.log"
Feb 26 12:21:23 crc kubenswrapper[4699]: I0226 12:21:23.626714 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw64v_5d015dd8-56c9-4f61-b133-4951cda91ca5/kube-rbac-proxy/0.log"
Feb 26 12:21:23 crc kubenswrapper[4699]: I0226 12:21:23.647958 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw64v_5d015dd8-56c9-4f61-b133-4951cda91ca5/machine-api-operator/0.log"
Feb 26 12:21:36 crc kubenswrapper[4699]: I0226 12:21:36.352136 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fhn2n_fc42522b-c5f4-4df2-8435-3e3985dd960c/cert-manager-controller/0.log"
Feb 26 12:21:36 crc kubenswrapper[4699]: I0226 12:21:36.529445 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dswxp_f026799a-39c7-443e-9801-f046ba8ae94b/cert-manager-cainjector/0.log"
Feb 26 12:21:36 crc kubenswrapper[4699]: I0226 12:21:36.667721 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-l2fdt_fad1f923-b22c-4c0d-9eb9-684636bc76c0/cert-manager-webhook/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.374479 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-7f4bx_13fc1aa0-a043-4b42-952b-7f718ff577d2/nmstate-console-plugin/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.581810 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5jrwg_80de38f0-8620-4e27-988e-6d85d7c8bc24/nmstate-handler/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.670530 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jnrsc_c4897df9-3a79-41bf-a7ba-7a72d888f8e1/kube-rbac-proxy/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.752752 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jnrsc_c4897df9-3a79-41bf-a7ba-7a72d888f8e1/nmstate-metrics/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.899225 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-8l8n8_15312afe-49aa-4681-8513-6ed9c774d222/nmstate-operator/0.log"
Feb 26 12:21:50 crc kubenswrapper[4699]: I0226 12:21:50.966377 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-qmw66_d674e733-7357-43e5-be9c-4d4e9bad252c/nmstate-webhook/0.log"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.142739 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535142-vfnhz"]
Feb 26 12:22:00 crc kubenswrapper[4699]: E0226 12:22:00.144166 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924cba42-fd14-4d50-815d-0d8fa83c6b06" containerName="oc"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.144181 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="924cba42-fd14-4d50-815d-0d8fa83c6b06" containerName="oc"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.144432 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="924cba42-fd14-4d50-815d-0d8fa83c6b06" containerName="oc"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.145084 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.148245 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.148535 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.148746 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.178254 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535142-vfnhz"]
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.252101 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn9wm\" (UniqueName: \"kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm\") pod \"auto-csr-approver-29535142-vfnhz\" (UID: \"8a1ba6a1-6a82-47c4-9706-f77275f34d3a\") " pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.354162 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn9wm\" (UniqueName: \"kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm\") pod \"auto-csr-approver-29535142-vfnhz\" (UID: \"8a1ba6a1-6a82-47c4-9706-f77275f34d3a\") " pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.378891 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn9wm\" (UniqueName: \"kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm\") pod \"auto-csr-approver-29535142-vfnhz\" (UID: \"8a1ba6a1-6a82-47c4-9706-f77275f34d3a\") " pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.468454 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:00 crc kubenswrapper[4699]: I0226 12:22:00.987987 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535142-vfnhz"]
Feb 26 12:22:01 crc kubenswrapper[4699]: I0226 12:22:01.300044 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535142-vfnhz" event={"ID":"8a1ba6a1-6a82-47c4-9706-f77275f34d3a","Type":"ContainerStarted","Data":"33a0346b64cb96af9bc84b1f89b7928e64871fca412d10252ed5502ee0a2b2fa"}
Feb 26 12:22:03 crc kubenswrapper[4699]: I0226 12:22:03.318603 4699 generic.go:334] "Generic (PLEG): container finished" podID="8a1ba6a1-6a82-47c4-9706-f77275f34d3a" containerID="99ba6bfc8510f503ebb43686b1e59641b632a364615001984ba3d20ee91c082d" exitCode=0
Feb 26 12:22:03 crc kubenswrapper[4699]: I0226 12:22:03.318703 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535142-vfnhz" event={"ID":"8a1ba6a1-6a82-47c4-9706-f77275f34d3a","Type":"ContainerDied","Data":"99ba6bfc8510f503ebb43686b1e59641b632a364615001984ba3d20ee91c082d"}
Feb 26 12:22:04 crc kubenswrapper[4699]: I0226 12:22:04.688294 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:04 crc kubenswrapper[4699]: I0226 12:22:04.838744 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn9wm\" (UniqueName: \"kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm\") pod \"8a1ba6a1-6a82-47c4-9706-f77275f34d3a\" (UID: \"8a1ba6a1-6a82-47c4-9706-f77275f34d3a\") "
Feb 26 12:22:04 crc kubenswrapper[4699]: I0226 12:22:04.844509 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm" (OuterVolumeSpecName: "kube-api-access-jn9wm") pod "8a1ba6a1-6a82-47c4-9706-f77275f34d3a" (UID: "8a1ba6a1-6a82-47c4-9706-f77275f34d3a"). InnerVolumeSpecName "kube-api-access-jn9wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:22:04 crc kubenswrapper[4699]: I0226 12:22:04.941116 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn9wm\" (UniqueName: \"kubernetes.io/projected/8a1ba6a1-6a82-47c4-9706-f77275f34d3a-kube-api-access-jn9wm\") on node \"crc\" DevicePath \"\""
Feb 26 12:22:05 crc kubenswrapper[4699]: I0226 12:22:05.339462 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535142-vfnhz" event={"ID":"8a1ba6a1-6a82-47c4-9706-f77275f34d3a","Type":"ContainerDied","Data":"33a0346b64cb96af9bc84b1f89b7928e64871fca412d10252ed5502ee0a2b2fa"}
Feb 26 12:22:05 crc kubenswrapper[4699]: I0226 12:22:05.339502 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a0346b64cb96af9bc84b1f89b7928e64871fca412d10252ed5502ee0a2b2fa"
Feb 26 12:22:05 crc kubenswrapper[4699]: I0226 12:22:05.339563 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535142-vfnhz"
Feb 26 12:22:05 crc kubenswrapper[4699]: I0226 12:22:05.756078 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535136-zr5lr"]
Feb 26 12:22:05 crc kubenswrapper[4699]: I0226 12:22:05.764901 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535136-zr5lr"]
Feb 26 12:22:06 crc kubenswrapper[4699]: I0226 12:22:06.270924 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502aea63-b1be-4c9e-850b-bc5a2503b628" path="/var/lib/kubelet/pods/502aea63-b1be-4c9e-850b-bc5a2503b628/volumes"
Feb 26 12:22:18 crc kubenswrapper[4699]: I0226 12:22:18.965252 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-bs5nk_6ef6a9d7-6997-485a-a812-ded9d3a2df85/kube-rbac-proxy/0.log"
Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.038456 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-bs5nk_6ef6a9d7-6997-485a-a812-ded9d3a2df85/controller/0.log"
Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.179922 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-svsrb_35357e2c-2a03-46f8-bc28-f7daad3b679d/frr-k8s-webhook-server/0.log"
Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.249561 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.431478 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log"
Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.462935 4699 log.go:25] "Finished parsing log file"
path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.462984 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.493170 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.613733 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.647511 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.665999 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.672092 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.922610 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-frr-files/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.927340 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-metrics/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.929239 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/cp-reloader/0.log" Feb 26 12:22:19 crc kubenswrapper[4699]: I0226 12:22:19.930917 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/controller/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.115154 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/kube-rbac-proxy/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.138342 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/frr-metrics/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.196823 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/kube-rbac-proxy-frr/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.375686 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/reloader/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.430693 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d58b8658b-qjr5b_cacc25f7-9d5d-4ba4-b7b7-f95d7cba63e8/manager/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.575434 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6d98597f89-glkjh_af2438c1-8812-4bb1-8999-66cb8d804c05/webhook-server/0.log" Feb 26 12:22:20 crc kubenswrapper[4699]: I0226 12:22:20.745498 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l8phj_d656ca89-f955-44bb-9944-f75bf485a254/kube-rbac-proxy/0.log" Feb 26 12:22:21 crc kubenswrapper[4699]: I0226 12:22:21.491223 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-l8phj_d656ca89-f955-44bb-9944-f75bf485a254/speaker/0.log" Feb 26 12:22:21 crc kubenswrapper[4699]: I0226 12:22:21.810522 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wszs7_dfa29d17-a66a-42fe-8275-1526f8fb6dc9/frr/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.445163 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.657437 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.711915 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.747691 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.904660 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/pull/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.929537 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/extract/0.log" Feb 26 12:22:33 crc kubenswrapper[4699]: I0226 12:22:33.930315 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8274hh5_a0751c34-68ec-4fd1-821f-94e314dd5621/util/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.069477 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.229094 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.234567 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.248351 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.418197 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-content/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.436125 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/extract-utilities/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.584632 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.886013 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.926541 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.975348 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log" Feb 26 12:22:34 crc kubenswrapper[4699]: I0226 12:22:34.981716 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vsj9_a23d2795-eec2-4e37-8902-7f9220e44cb1/registry-server/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.080924 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-utilities/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.160938 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/extract-content/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.274385 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.530824 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.557680 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.595644 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.788462 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/pull/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.795578 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/util/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.813785 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfvdf_fde9effb-9fa9-46a0-a8e6-08080ed0b8ba/registry-server/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.840775 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d7whb_2628fd13-0f89-4bb3-9b76-86a9331a303e/extract/0.log" Feb 26 12:22:35 crc kubenswrapper[4699]: I0226 12:22:35.971745 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nwbkq_43a980f6-1eff-4610-aa3e-69729c3eb7c7/marketplace-operator/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.069554 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.247835 4699 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.252481 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.309031 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.479438 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-utilities/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.548953 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/extract-content/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.734337 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.734937 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r555d_d174508d-e5d5-4912-a652-e7b264f1c882/registry-server/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.913576 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.918731 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log" Feb 26 12:22:36 crc kubenswrapper[4699]: I0226 12:22:36.921488 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log" Feb 26 12:22:37 crc kubenswrapper[4699]: I0226 12:22:37.162181 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-utilities/0.log" Feb 26 12:22:37 crc kubenswrapper[4699]: I0226 12:22:37.168103 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/extract-content/0.log" Feb 26 12:22:37 crc kubenswrapper[4699]: I0226 12:22:37.680403 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wqmqz_a69df934-4fa7-472d-abe7-8fa4ec5d4296/registry-server/0.log" Feb 26 12:22:40 crc kubenswrapper[4699]: I0226 12:22:40.653004 4699 scope.go:117] "RemoveContainer" containerID="6e828c6eb232b14fedfc4161c27c5a5dd3b91bd1fe215ef080f8deb69fce1e31" Feb 26 12:22:41 crc kubenswrapper[4699]: I0226 12:22:41.585243 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:22:41 crc kubenswrapper[4699]: I0226 12:22:41.585648 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:23:11 
crc kubenswrapper[4699]: I0226 12:23:11.585684 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:23:11 crc kubenswrapper[4699]: I0226 12:23:11.586285 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:23:41 crc kubenswrapper[4699]: I0226 12:23:41.584788 4699 patch_prober.go:28] interesting pod/machine-config-daemon-28p79 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 12:23:41 crc kubenswrapper[4699]: I0226 12:23:41.585367 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 12:23:41 crc kubenswrapper[4699]: I0226 12:23:41.585421 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-28p79" Feb 26 12:23:41 crc kubenswrapper[4699]: I0226 12:23:41.586305 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"} 
pod="openshift-machine-config-operator/machine-config-daemon-28p79" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 12:23:41 crc kubenswrapper[4699]: I0226 12:23:41.586365 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerName="machine-config-daemon" containerID="cri-o://db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" gracePeriod=600 Feb 26 12:23:41 crc kubenswrapper[4699]: E0226 12:23:41.736028 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:23:42 crc kubenswrapper[4699]: I0226 12:23:42.204668 4699 generic.go:334] "Generic (PLEG): container finished" podID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" exitCode=0 Feb 26 12:23:42 crc kubenswrapper[4699]: I0226 12:23:42.204740 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerDied","Data":"db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"} Feb 26 12:23:42 crc kubenswrapper[4699]: I0226 12:23:42.204789 4699 scope.go:117] "RemoveContainer" containerID="321e80bae8579e8007aa1cc495575fd7eef57d9379aadf703c862dea223958e5" Feb 26 12:23:42 crc kubenswrapper[4699]: I0226 12:23:42.206973 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 
26 12:23:42 crc kubenswrapper[4699]: E0226 12:23:42.209575 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:23:55 crc kubenswrapper[4699]: I0226 12:23:55.261317 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:23:55 crc kubenswrapper[4699]: E0226 12:23:55.262161 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.147345 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535144-5h2px"] Feb 26 12:24:00 crc kubenswrapper[4699]: E0226 12:24:00.150030 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a1ba6a1-6a82-47c4-9706-f77275f34d3a" containerName="oc" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.150204 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a1ba6a1-6a82-47c4-9706-f77275f34d3a" containerName="oc" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.150820 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a1ba6a1-6a82-47c4-9706-f77275f34d3a" containerName="oc" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.151919 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.156445 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.156714 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.157089 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.164966 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535144-5h2px"] Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.262295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndns\" (UniqueName: \"kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns\") pod \"auto-csr-approver-29535144-5h2px\" (UID: \"1c4ce589-abf9-443e-8f50-2d1904d537ad\") " pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.365341 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndns\" (UniqueName: \"kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns\") pod \"auto-csr-approver-29535144-5h2px\" (UID: \"1c4ce589-abf9-443e-8f50-2d1904d537ad\") " pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.637583 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndns\" (UniqueName: \"kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns\") pod \"auto-csr-approver-29535144-5h2px\" (UID: \"1c4ce589-abf9-443e-8f50-2d1904d537ad\") " 
pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:00 crc kubenswrapper[4699]: I0226 12:24:00.773404 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:01 crc kubenswrapper[4699]: I0226 12:24:01.233782 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535144-5h2px"] Feb 26 12:24:01 crc kubenswrapper[4699]: I0226 12:24:01.404164 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535144-5h2px" event={"ID":"1c4ce589-abf9-443e-8f50-2d1904d537ad","Type":"ContainerStarted","Data":"3ccbcce0173c0d2bec4896a426264bd199f80a8b3d7002d7d0419cc9a24823a7"} Feb 26 12:24:03 crc kubenswrapper[4699]: I0226 12:24:03.422299 4699 generic.go:334] "Generic (PLEG): container finished" podID="1c4ce589-abf9-443e-8f50-2d1904d537ad" containerID="78ce7123b16eb0d6213f96c0626817e0eb21374dea43ac7a0eeccd31bcc7f327" exitCode=0 Feb 26 12:24:03 crc kubenswrapper[4699]: I0226 12:24:03.422352 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535144-5h2px" event={"ID":"1c4ce589-abf9-443e-8f50-2d1904d537ad","Type":"ContainerDied","Data":"78ce7123b16eb0d6213f96c0626817e0eb21374dea43ac7a0eeccd31bcc7f327"} Feb 26 12:24:04 crc kubenswrapper[4699]: I0226 12:24:04.757485 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:04 crc kubenswrapper[4699]: I0226 12:24:04.868306 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndns\" (UniqueName: \"kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns\") pod \"1c4ce589-abf9-443e-8f50-2d1904d537ad\" (UID: \"1c4ce589-abf9-443e-8f50-2d1904d537ad\") " Feb 26 12:24:04 crc kubenswrapper[4699]: I0226 12:24:04.876628 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns" (OuterVolumeSpecName: "kube-api-access-8ndns") pod "1c4ce589-abf9-443e-8f50-2d1904d537ad" (UID: "1c4ce589-abf9-443e-8f50-2d1904d537ad"). InnerVolumeSpecName "kube-api-access-8ndns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:24:04 crc kubenswrapper[4699]: I0226 12:24:04.971092 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ndns\" (UniqueName: \"kubernetes.io/projected/1c4ce589-abf9-443e-8f50-2d1904d537ad-kube-api-access-8ndns\") on node \"crc\" DevicePath \"\"" Feb 26 12:24:05 crc kubenswrapper[4699]: I0226 12:24:05.443053 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535144-5h2px" event={"ID":"1c4ce589-abf9-443e-8f50-2d1904d537ad","Type":"ContainerDied","Data":"3ccbcce0173c0d2bec4896a426264bd199f80a8b3d7002d7d0419cc9a24823a7"} Feb 26 12:24:05 crc kubenswrapper[4699]: I0226 12:24:05.443347 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccbcce0173c0d2bec4896a426264bd199f80a8b3d7002d7d0419cc9a24823a7" Feb 26 12:24:05 crc kubenswrapper[4699]: I0226 12:24:05.443130 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535144-5h2px" Feb 26 12:24:05 crc kubenswrapper[4699]: I0226 12:24:05.826052 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535138-ghxdv"] Feb 26 12:24:05 crc kubenswrapper[4699]: I0226 12:24:05.837142 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535138-ghxdv"] Feb 26 12:24:06 crc kubenswrapper[4699]: I0226 12:24:06.276514 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d73a20e-eea0-421b-8efd-6fd86f1e4d98" path="/var/lib/kubelet/pods/3d73a20e-eea0-421b-8efd-6fd86f1e4d98/volumes" Feb 26 12:24:10 crc kubenswrapper[4699]: I0226 12:24:10.260525 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:24:10 crc kubenswrapper[4699]: E0226 12:24:10.261070 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:24:25 crc kubenswrapper[4699]: I0226 12:24:25.260930 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:24:25 crc kubenswrapper[4699]: E0226 12:24:25.261864 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" 
podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:24:28 crc kubenswrapper[4699]: I0226 12:24:28.705315 4699 generic.go:334] "Generic (PLEG): container finished" podID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerID="f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0" exitCode=0 Feb 26 12:24:28 crc kubenswrapper[4699]: I0226 12:24:28.705408 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" event={"ID":"e1a2e674-d3fd-4fac-b5e0-b201dd644f25","Type":"ContainerDied","Data":"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0"} Feb 26 12:24:28 crc kubenswrapper[4699]: I0226 12:24:28.706234 4699 scope.go:117] "RemoveContainer" containerID="f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0" Feb 26 12:24:28 crc kubenswrapper[4699]: I0226 12:24:28.819647 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2l5g_must-gather-zwd9v_e1a2e674-d3fd-4fac-b5e0-b201dd644f25/gather/0.log" Feb 26 12:24:37 crc kubenswrapper[4699]: I0226 12:24:37.261272 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:24:37 crc kubenswrapper[4699]: E0226 12:24:37.262361 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:24:39 crc kubenswrapper[4699]: I0226 12:24:39.853448 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l2l5g/must-gather-zwd9v"] Feb 26 12:24:39 crc kubenswrapper[4699]: I0226 12:24:39.854312 4699 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="copy" containerID="cri-o://69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e" gracePeriod=2 Feb 26 12:24:39 crc kubenswrapper[4699]: I0226 12:24:39.868867 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l2l5g/must-gather-zwd9v"] Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.290742 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2l5g_must-gather-zwd9v_e1a2e674-d3fd-4fac-b5e0-b201dd644f25/copy/0.log" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.291623 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.421645 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output\") pod \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.421715 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlmbm\" (UniqueName: \"kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm\") pod \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\" (UID: \"e1a2e674-d3fd-4fac-b5e0-b201dd644f25\") " Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.428719 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm" (OuterVolumeSpecName: "kube-api-access-zlmbm") pod "e1a2e674-d3fd-4fac-b5e0-b201dd644f25" (UID: "e1a2e674-d3fd-4fac-b5e0-b201dd644f25"). InnerVolumeSpecName "kube-api-access-zlmbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.523697 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlmbm\" (UniqueName: \"kubernetes.io/projected/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-kube-api-access-zlmbm\") on node \"crc\" DevicePath \"\"" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.610925 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e1a2e674-d3fd-4fac-b5e0-b201dd644f25" (UID: "e1a2e674-d3fd-4fac-b5e0-b201dd644f25"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.631148 4699 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e1a2e674-d3fd-4fac-b5e0-b201dd644f25-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.833672 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2l5g_must-gather-zwd9v_e1a2e674-d3fd-4fac-b5e0-b201dd644f25/copy/0.log" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.834183 4699 generic.go:334] "Generic (PLEG): container finished" podID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerID="69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e" exitCode=143 Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.834250 4699 scope.go:117] "RemoveContainer" containerID="69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.834282 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2l5g/must-gather-zwd9v" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.890162 4699 scope.go:117] "RemoveContainer" containerID="f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.974839 4699 scope.go:117] "RemoveContainer" containerID="69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e" Feb 26 12:24:40 crc kubenswrapper[4699]: E0226 12:24:40.975291 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e\": container with ID starting with 69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e not found: ID does not exist" containerID="69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.975332 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e"} err="failed to get container status \"69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e\": rpc error: code = NotFound desc = could not find container \"69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e\": container with ID starting with 69b0e2cefceef6db918052929a4b9d82fd8ed38656cef9eade78076401ce807e not found: ID does not exist" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.975360 4699 scope.go:117] "RemoveContainer" containerID="f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0" Feb 26 12:24:40 crc kubenswrapper[4699]: E0226 12:24:40.975583 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0\": container with ID starting with 
f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0 not found: ID does not exist" containerID="f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0" Feb 26 12:24:40 crc kubenswrapper[4699]: I0226 12:24:40.975612 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0"} err="failed to get container status \"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0\": rpc error: code = NotFound desc = could not find container \"f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0\": container with ID starting with f14354991cbd7caa139684c2977ca7c2377002acb68ea2e984ee4fc98ce473f0 not found: ID does not exist" Feb 26 12:24:41 crc kubenswrapper[4699]: I0226 12:24:41.317644 4699 scope.go:117] "RemoveContainer" containerID="206617da387e97d81b9b831e8d26536a56cede7f0a2daac8fe00d38d64e627ce" Feb 26 12:24:42 crc kubenswrapper[4699]: I0226 12:24:42.278645 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" path="/var/lib/kubelet/pods/e1a2e674-d3fd-4fac-b5e0-b201dd644f25/volumes" Feb 26 12:24:48 crc kubenswrapper[4699]: I0226 12:24:48.261588 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:24:48 crc kubenswrapper[4699]: E0226 12:24:48.262688 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:00 crc kubenswrapper[4699]: I0226 12:25:00.262994 4699 scope.go:117] "RemoveContainer" 
containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:25:00 crc kubenswrapper[4699]: E0226 12:25:00.263828 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:13 crc kubenswrapper[4699]: I0226 12:25:13.261140 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:25:13 crc kubenswrapper[4699]: E0226 12:25:13.261900 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:28 crc kubenswrapper[4699]: I0226 12:25:28.261236 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:25:28 crc kubenswrapper[4699]: E0226 12:25:28.262094 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:43 crc kubenswrapper[4699]: I0226 12:25:43.260208 4699 scope.go:117] 
"RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:25:43 crc kubenswrapper[4699]: E0226 12:25:43.261005 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.502654 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q6n44"] Feb 26 12:25:45 crc kubenswrapper[4699]: E0226 12:25:45.503517 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4ce589-abf9-443e-8f50-2d1904d537ad" containerName="oc" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.503536 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4ce589-abf9-443e-8f50-2d1904d537ad" containerName="oc" Feb 26 12:25:45 crc kubenswrapper[4699]: E0226 12:25:45.503567 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="gather" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.503576 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="gather" Feb 26 12:25:45 crc kubenswrapper[4699]: E0226 12:25:45.503596 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="copy" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.503603 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="copy" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.505371 4699 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="gather" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.505400 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4ce589-abf9-443e-8f50-2d1904d537ad" containerName="oc" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.505423 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a2e674-d3fd-4fac-b5e0-b201dd644f25" containerName="copy" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.507206 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.517400 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6n44"] Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.610746 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.610828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnsp6\" (UniqueName: \"kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.610987 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content\") pod \"community-operators-q6n44\" (UID: 
\"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.713137 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.713283 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.713321 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnsp6\" (UniqueName: \"kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.713865 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.713888 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") 
" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.735156 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnsp6\" (UniqueName: \"kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6\") pod \"community-operators-q6n44\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:45 crc kubenswrapper[4699]: I0226 12:25:45.837215 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:46 crc kubenswrapper[4699]: I0226 12:25:46.137101 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6n44"] Feb 26 12:25:46 crc kubenswrapper[4699]: I0226 12:25:46.498474 4699 generic.go:334] "Generic (PLEG): container finished" podID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerID="f63a74735857407b0a5b43b39056c8d70c60ab1d68d78bf2372e9fa58517adae" exitCode=0 Feb 26 12:25:46 crc kubenswrapper[4699]: I0226 12:25:46.498909 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerDied","Data":"f63a74735857407b0a5b43b39056c8d70c60ab1d68d78bf2372e9fa58517adae"} Feb 26 12:25:46 crc kubenswrapper[4699]: I0226 12:25:46.498968 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerStarted","Data":"f86903ef9072f064c8fb46b2178effaa7337edbc72bcdb85b9669719d99d0bbb"} Feb 26 12:25:46 crc kubenswrapper[4699]: I0226 12:25:46.500993 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 12:25:49 crc kubenswrapper[4699]: I0226 12:25:49.537187 4699 generic.go:334] "Generic (PLEG): container 
finished" podID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerID="61ef761611c76e0cf0549c286fd56950d21e43fe0a1e1a9112ef04ae2af064a2" exitCode=0 Feb 26 12:25:49 crc kubenswrapper[4699]: I0226 12:25:49.537225 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerDied","Data":"61ef761611c76e0cf0549c286fd56950d21e43fe0a1e1a9112ef04ae2af064a2"} Feb 26 12:25:50 crc kubenswrapper[4699]: I0226 12:25:50.548262 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerStarted","Data":"2fa874b7398241542e9ad132af0702e21777587a267cb55b37517ce2148cb215"} Feb 26 12:25:50 crc kubenswrapper[4699]: I0226 12:25:50.572610 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q6n44" podStartSLOduration=2.065656132 podStartE2EDuration="5.572548488s" podCreationTimestamp="2026-02-26 12:25:45 +0000 UTC" firstStartedPulling="2026-02-26 12:25:46.500675526 +0000 UTC m=+4492.311501960" lastFinishedPulling="2026-02-26 12:25:50.007567882 +0000 UTC m=+4495.818394316" observedRunningTime="2026-02-26 12:25:50.569719896 +0000 UTC m=+4496.380546340" watchObservedRunningTime="2026-02-26 12:25:50.572548488 +0000 UTC m=+4496.383374932" Feb 26 12:25:55 crc kubenswrapper[4699]: I0226 12:25:55.837853 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:55 crc kubenswrapper[4699]: I0226 12:25:55.838450 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:55 crc kubenswrapper[4699]: I0226 12:25:55.892751 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q6n44" Feb 
26 12:25:56 crc kubenswrapper[4699]: I0226 12:25:56.267420 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:25:56 crc kubenswrapper[4699]: E0226 12:25:56.267814 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:25:56 crc kubenswrapper[4699]: I0226 12:25:56.648375 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:56 crc kubenswrapper[4699]: I0226 12:25:56.696815 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q6n44"] Feb 26 12:25:58 crc kubenswrapper[4699]: I0226 12:25:58.623612 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q6n44" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="registry-server" containerID="cri-o://2fa874b7398241542e9ad132af0702e21777587a267cb55b37517ce2148cb215" gracePeriod=2 Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.643422 4699 generic.go:334] "Generic (PLEG): container finished" podID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerID="2fa874b7398241542e9ad132af0702e21777587a267cb55b37517ce2148cb215" exitCode=0 Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.643566 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerDied","Data":"2fa874b7398241542e9ad132af0702e21777587a267cb55b37517ce2148cb215"} Feb 26 12:25:59 crc 
kubenswrapper[4699]: I0226 12:25:59.770904 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6n44" Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.900030 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content\") pod \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.900092 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnsp6\" (UniqueName: \"kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6\") pod \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.900218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities\") pod \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\" (UID: \"f891d809-e0a8-4802-a2a3-2fd5d0d45607\") " Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.901357 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities" (OuterVolumeSpecName: "utilities") pod "f891d809-e0a8-4802-a2a3-2fd5d0d45607" (UID: "f891d809-e0a8-4802-a2a3-2fd5d0d45607"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:25:59 crc kubenswrapper[4699]: I0226 12:25:59.905387 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6" (OuterVolumeSpecName: "kube-api-access-pnsp6") pod "f891d809-e0a8-4802-a2a3-2fd5d0d45607" (UID: "f891d809-e0a8-4802-a2a3-2fd5d0d45607"). InnerVolumeSpecName "kube-api-access-pnsp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.002924 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnsp6\" (UniqueName: \"kubernetes.io/projected/f891d809-e0a8-4802-a2a3-2fd5d0d45607-kube-api-access-pnsp6\") on node \"crc\" DevicePath \"\"" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.002963 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.151607 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535146-glcgw"] Feb 26 12:26:00 crc kubenswrapper[4699]: E0226 12:26:00.152385 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="extract-utilities" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.152403 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="extract-utilities" Feb 26 12:26:00 crc kubenswrapper[4699]: E0226 12:26:00.152420 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="extract-content" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.152427 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" 
containerName="extract-content" Feb 26 12:26:00 crc kubenswrapper[4699]: E0226 12:26:00.152458 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="registry-server" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.152463 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="registry-server" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.152656 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" containerName="registry-server" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.153247 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535146-glcgw" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.155310 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.156208 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.159407 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.182173 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535146-glcgw"] Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.208663 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxqm\" (UniqueName: \"kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm\") pod \"auto-csr-approver-29535146-glcgw\" (UID: \"654da7d8-e431-4d84-97bb-81179a5c382f\") " pod="openshift-infra/auto-csr-approver-29535146-glcgw" Feb 26 12:26:00 crc 
kubenswrapper[4699]: I0226 12:26:00.311185 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxqm\" (UniqueName: \"kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm\") pod \"auto-csr-approver-29535146-glcgw\" (UID: \"654da7d8-e431-4d84-97bb-81179a5c382f\") " pod="openshift-infra/auto-csr-approver-29535146-glcgw"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.322962 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f891d809-e0a8-4802-a2a3-2fd5d0d45607" (UID: "f891d809-e0a8-4802-a2a3-2fd5d0d45607"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.334044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxqm\" (UniqueName: \"kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm\") pod \"auto-csr-approver-29535146-glcgw\" (UID: \"654da7d8-e431-4d84-97bb-81179a5c382f\") " pod="openshift-infra/auto-csr-approver-29535146-glcgw"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.412965 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f891d809-e0a8-4802-a2a3-2fd5d0d45607-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.483051 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535146-glcgw"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.656979 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6n44" event={"ID":"f891d809-e0a8-4802-a2a3-2fd5d0d45607","Type":"ContainerDied","Data":"f86903ef9072f064c8fb46b2178effaa7337edbc72bcdb85b9669719d99d0bbb"}
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.657317 4699 scope.go:117] "RemoveContainer" containerID="2fa874b7398241542e9ad132af0702e21777587a267cb55b37517ce2148cb215"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.657061 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6n44"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.690319 4699 scope.go:117] "RemoveContainer" containerID="61ef761611c76e0cf0549c286fd56950d21e43fe0a1e1a9112ef04ae2af064a2"
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.730560 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q6n44"]
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.745402 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q6n44"]
Feb 26 12:26:00 crc kubenswrapper[4699]: I0226 12:26:00.946619 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535146-glcgw"]
Feb 26 12:26:01 crc kubenswrapper[4699]: I0226 12:26:01.238017 4699 scope.go:117] "RemoveContainer" containerID="f63a74735857407b0a5b43b39056c8d70c60ab1d68d78bf2372e9fa58517adae"
Feb 26 12:26:01 crc kubenswrapper[4699]: I0226 12:26:01.668512 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535146-glcgw" event={"ID":"654da7d8-e431-4d84-97bb-81179a5c382f","Type":"ContainerStarted","Data":"7494a362efef771ab7f6ae41a997f1736c91f2472d7fbc4bddad128eb8ce5e9a"}
Feb 26 12:26:02 crc kubenswrapper[4699]: I0226 12:26:02.290266 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f891d809-e0a8-4802-a2a3-2fd5d0d45607" path="/var/lib/kubelet/pods/f891d809-e0a8-4802-a2a3-2fd5d0d45607/volumes"
Feb 26 12:26:03 crc kubenswrapper[4699]: I0226 12:26:03.687832 4699 generic.go:334] "Generic (PLEG): container finished" podID="654da7d8-e431-4d84-97bb-81179a5c382f" containerID="ad9766b89198b6923833e66b567c7898f5dfe70a013994bcfa98a01accc75132" exitCode=0
Feb 26 12:26:03 crc kubenswrapper[4699]: I0226 12:26:03.687881 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535146-glcgw" event={"ID":"654da7d8-e431-4d84-97bb-81179a5c382f","Type":"ContainerDied","Data":"ad9766b89198b6923833e66b567c7898f5dfe70a013994bcfa98a01accc75132"}
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.023840 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535146-glcgw"
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.111341 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrxqm\" (UniqueName: \"kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm\") pod \"654da7d8-e431-4d84-97bb-81179a5c382f\" (UID: \"654da7d8-e431-4d84-97bb-81179a5c382f\") "
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.129157 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm" (OuterVolumeSpecName: "kube-api-access-mrxqm") pod "654da7d8-e431-4d84-97bb-81179a5c382f" (UID: "654da7d8-e431-4d84-97bb-81179a5c382f"). InnerVolumeSpecName "kube-api-access-mrxqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.213946 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrxqm\" (UniqueName: \"kubernetes.io/projected/654da7d8-e431-4d84-97bb-81179a5c382f-kube-api-access-mrxqm\") on node \"crc\" DevicePath \"\""
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.707873 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535146-glcgw" event={"ID":"654da7d8-e431-4d84-97bb-81179a5c382f","Type":"ContainerDied","Data":"7494a362efef771ab7f6ae41a997f1736c91f2472d7fbc4bddad128eb8ce5e9a"}
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.707925 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7494a362efef771ab7f6ae41a997f1736c91f2472d7fbc4bddad128eb8ce5e9a"
Feb 26 12:26:05 crc kubenswrapper[4699]: I0226 12:26:05.707946 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535146-glcgw"
Feb 26 12:26:06 crc kubenswrapper[4699]: I0226 12:26:06.102666 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535140-wg97p"]
Feb 26 12:26:06 crc kubenswrapper[4699]: I0226 12:26:06.116280 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535140-wg97p"]
Feb 26 12:26:06 crc kubenswrapper[4699]: I0226 12:26:06.272520 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924cba42-fd14-4d50-815d-0d8fa83c6b06" path="/var/lib/kubelet/pods/924cba42-fd14-4d50-815d-0d8fa83c6b06/volumes"
Feb 26 12:26:09 crc kubenswrapper[4699]: I0226 12:26:09.260964 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:26:09 crc kubenswrapper[4699]: E0226 12:26:09.262050 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:26:23 crc kubenswrapper[4699]: I0226 12:26:23.261284 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:26:23 crc kubenswrapper[4699]: E0226 12:26:23.263209 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:26:37 crc kubenswrapper[4699]: I0226 12:26:37.260898 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:26:37 crc kubenswrapper[4699]: E0226 12:26:37.261602 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:26:41 crc kubenswrapper[4699]: I0226 12:26:41.408622 4699 scope.go:117] "RemoveContainer" containerID="74ea3c51dc439314ff3bb87ede5fd5f905e28e2682d357fd7d7822dde4facddf"
Feb 26 12:26:52 crc kubenswrapper[4699]: I0226 12:26:52.261536 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:26:52 crc kubenswrapper[4699]: E0226 12:26:52.262521 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:27:03 crc kubenswrapper[4699]: I0226 12:27:03.260878 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:27:03 crc kubenswrapper[4699]: E0226 12:27:03.261719 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:27:18 crc kubenswrapper[4699]: I0226 12:27:18.260627 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:27:18 crc kubenswrapper[4699]: E0226 12:27:18.261524 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.163705 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:33 crc kubenswrapper[4699]: E0226 12:27:33.168478 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654da7d8-e431-4d84-97bb-81179a5c382f" containerName="oc"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.168499 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="654da7d8-e431-4d84-97bb-81179a5c382f" containerName="oc"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.168703 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="654da7d8-e431-4d84-97bb-81179a5c382f" containerName="oc"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.170169 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.179643 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.260423 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:27:33 crc kubenswrapper[4699]: E0226 12:27:33.260766 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.361457 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dx9\" (UniqueName: \"kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.361980 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.362602 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.464243 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dx9\" (UniqueName: \"kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.464314 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.464446 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.464923 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.465138 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.484712 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dx9\" (UniqueName: \"kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9\") pod \"redhat-marketplace-6t9z2\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") " pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.487478 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:33 crc kubenswrapper[4699]: I0226 12:27:33.953630 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:34 crc kubenswrapper[4699]: I0226 12:27:34.495400 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerStarted","Data":"482f4c95523e33a2ef21be55b9a8f8be2b13ce328c104dfd3b67ee35efb3e958"}
Feb 26 12:27:35 crc kubenswrapper[4699]: I0226 12:27:35.507150 4699 generic.go:334] "Generic (PLEG): container finished" podID="eaaa487f-21d2-470a-9bce-914a42da0710" containerID="fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29" exitCode=0
Feb 26 12:27:35 crc kubenswrapper[4699]: I0226 12:27:35.507329 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerDied","Data":"fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29"}
Feb 26 12:27:37 crc kubenswrapper[4699]: I0226 12:27:37.529549 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerStarted","Data":"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"}
Feb 26 12:27:38 crc kubenswrapper[4699]: I0226 12:27:38.541801 4699 generic.go:334] "Generic (PLEG): container finished" podID="eaaa487f-21d2-470a-9bce-914a42da0710" containerID="28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0" exitCode=0
Feb 26 12:27:38 crc kubenswrapper[4699]: I0226 12:27:38.541897 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerDied","Data":"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"}
Feb 26 12:27:39 crc kubenswrapper[4699]: I0226 12:27:39.551082 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerStarted","Data":"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"}
Feb 26 12:27:39 crc kubenswrapper[4699]: I0226 12:27:39.576293 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6t9z2" podStartSLOduration=3.027688096 podStartE2EDuration="6.57627032s" podCreationTimestamp="2026-02-26 12:27:33 +0000 UTC" firstStartedPulling="2026-02-26 12:27:35.51129264 +0000 UTC m=+4601.322119074" lastFinishedPulling="2026-02-26 12:27:39.059874874 +0000 UTC m=+4604.870701298" observedRunningTime="2026-02-26 12:27:39.566286161 +0000 UTC m=+4605.377112615" watchObservedRunningTime="2026-02-26 12:27:39.57627032 +0000 UTC m=+4605.387096754"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.557184 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"]
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.560243 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.587052 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"]
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.623276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.623570 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.623825 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqg8\" (UniqueName: \"kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.725701 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqg8\" (UniqueName: \"kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.725777 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.725809 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.726386 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.726904 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.749798 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqg8\" (UniqueName: \"kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8\") pod \"certified-operators-jt4jp\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:41 crc kubenswrapper[4699]: I0226 12:27:41.885735 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:42 crc kubenswrapper[4699]: I0226 12:27:42.360449 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"]
Feb 26 12:27:42 crc kubenswrapper[4699]: I0226 12:27:42.576404 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerStarted","Data":"02a24e01b1c182ae9a9efe8461a684df616383c8ce03e663ec76416efe2f36ea"}
Feb 26 12:27:43 crc kubenswrapper[4699]: I0226 12:27:43.487817 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:43 crc kubenswrapper[4699]: I0226 12:27:43.489392 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:43 crc kubenswrapper[4699]: I0226 12:27:43.548211 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:43 crc kubenswrapper[4699]: I0226 12:27:43.586994 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerDied","Data":"b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7"}
Feb 26 12:27:43 crc kubenswrapper[4699]: I0226 12:27:43.586930 4699 generic.go:334] "Generic (PLEG): container finished" podID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerID="b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7" exitCode=0
Feb 26 12:27:44 crc kubenswrapper[4699]: I0226 12:27:44.647474 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:46 crc kubenswrapper[4699]: I0226 12:27:46.543933 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:46 crc kubenswrapper[4699]: I0226 12:27:46.615185 4699 generic.go:334] "Generic (PLEG): container finished" podID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerID="596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7" exitCode=0
Feb 26 12:27:46 crc kubenswrapper[4699]: I0226 12:27:46.615390 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6t9z2" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="registry-server" containerID="cri-o://e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c" gracePeriod=2
Feb 26 12:27:46 crc kubenswrapper[4699]: I0226 12:27:46.615377 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerDied","Data":"596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7"}
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.150593 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.251459 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82dx9\" (UniqueName: \"kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9\") pod \"eaaa487f-21d2-470a-9bce-914a42da0710\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") "
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.253204 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities\") pod \"eaaa487f-21d2-470a-9bce-914a42da0710\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") "
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.253242 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content\") pod \"eaaa487f-21d2-470a-9bce-914a42da0710\" (UID: \"eaaa487f-21d2-470a-9bce-914a42da0710\") "
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.254776 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities" (OuterVolumeSpecName: "utilities") pod "eaaa487f-21d2-470a-9bce-914a42da0710" (UID: "eaaa487f-21d2-470a-9bce-914a42da0710"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.291904 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eaaa487f-21d2-470a-9bce-914a42da0710" (UID: "eaaa487f-21d2-470a-9bce-914a42da0710"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.356709 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.356978 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaa487f-21d2-470a-9bce-914a42da0710-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.626067 4699 generic.go:334] "Generic (PLEG): container finished" podID="eaaa487f-21d2-470a-9bce-914a42da0710" containerID="e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c" exitCode=0
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.626186 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerDied","Data":"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"}
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.626228 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6t9z2" event={"ID":"eaaa487f-21d2-470a-9bce-914a42da0710","Type":"ContainerDied","Data":"482f4c95523e33a2ef21be55b9a8f8be2b13ce328c104dfd3b67ee35efb3e958"}
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.626253 4699 scope.go:117] "RemoveContainer" containerID="e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.627354 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6t9z2"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.629220 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerStarted","Data":"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521"}
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.646016 4699 scope.go:117] "RemoveContainer" containerID="28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.658447 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jt4jp" podStartSLOduration=3.266455104 podStartE2EDuration="6.658425395s" podCreationTimestamp="2026-02-26 12:27:41 +0000 UTC" firstStartedPulling="2026-02-26 12:27:43.589416978 +0000 UTC m=+4609.400243402" lastFinishedPulling="2026-02-26 12:27:46.981387259 +0000 UTC m=+4612.792213693" observedRunningTime="2026-02-26 12:27:47.65241493 +0000 UTC m=+4613.463241374" watchObservedRunningTime="2026-02-26 12:27:47.658425395 +0000 UTC m=+4613.469251829"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.738778 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9" (OuterVolumeSpecName: "kube-api-access-82dx9") pod "eaaa487f-21d2-470a-9bce-914a42da0710" (UID: "eaaa487f-21d2-470a-9bce-914a42da0710"). InnerVolumeSpecName "kube-api-access-82dx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.741891 4699 scope.go:117] "RemoveContainer" containerID="fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.766327 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82dx9\" (UniqueName: \"kubernetes.io/projected/eaaa487f-21d2-470a-9bce-914a42da0710-kube-api-access-82dx9\") on node \"crc\" DevicePath \"\""
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.832383 4699 scope.go:117] "RemoveContainer" containerID="e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"
Feb 26 12:27:47 crc kubenswrapper[4699]: E0226 12:27:47.832983 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c\": container with ID starting with e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c not found: ID does not exist" containerID="e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.833015 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c"} err="failed to get container status \"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c\": rpc error: code = NotFound desc = could not find container \"e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c\": container with ID starting with e77c07126944e5ff02c11f60634c728650d676e0f4404cb5b18f9299b090c39c not found: ID does not exist"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.833036 4699 scope.go:117] "RemoveContainer" containerID="28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"
Feb 26 12:27:47 crc kubenswrapper[4699]: E0226 12:27:47.833404 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0\": container with ID starting with 28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0 not found: ID does not exist" containerID="28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.833425 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0"} err="failed to get container status \"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0\": rpc error: code = NotFound desc = could not find container \"28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0\": container with ID starting with 28c31cfddd5850fe577c6d8f161045b7c089c1ee5bd2e321948f303c98410cf0 not found: ID does not exist"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.833440 4699 scope.go:117] "RemoveContainer" containerID="fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29"
Feb 26 12:27:47 crc kubenswrapper[4699]: E0226 12:27:47.833765 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29\": container with ID starting with fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29 not found: ID does not exist" containerID="fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.833786 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29"} err="failed to get container status \"fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29\": rpc error: code = NotFound desc = could not find container \"fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29\": container with ID starting with fc69724a90e386282f69cffc1a4653e7d87b6517a937f65097c7cf5f0f76da29 not found: ID does not exist"
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.972976 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:47 crc kubenswrapper[4699]: I0226 12:27:47.982035 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6t9z2"]
Feb 26 12:27:48 crc kubenswrapper[4699]: I0226 12:27:48.261221 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30"
Feb 26 12:27:48 crc kubenswrapper[4699]: E0226 12:27:48.261541 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff"
Feb 26 12:27:48 crc kubenswrapper[4699]: I0226 12:27:48.274872 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" path="/var/lib/kubelet/pods/eaaa487f-21d2-470a-9bce-914a42da0710/volumes"
Feb 26 12:27:51 crc kubenswrapper[4699]: I0226 12:27:51.886874 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:51 crc kubenswrapper[4699]: I0226 12:27:51.887531 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:51 crc kubenswrapper[4699]: I0226 12:27:51.939050 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:52 crc kubenswrapper[4699]: I0226 12:27:52.728186 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:53 crc kubenswrapper[4699]: I0226 12:27:53.341228 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"]
Feb 26 12:27:54 crc kubenswrapper[4699]: I0226 12:27:54.700576 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jt4jp" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="registry-server" containerID="cri-o://ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521" gracePeriod=2
Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.224456 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt4jp"
Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.336756 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content\") pod \"02d3bec2-0500-4cf0-bd86-afb60afff196\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") "
Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.336879 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities\") pod \"02d3bec2-0500-4cf0-bd86-afb60afff196\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") "
Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.336989 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghqg8\" (UniqueName: \"kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8\") pod 
\"02d3bec2-0500-4cf0-bd86-afb60afff196\" (UID: \"02d3bec2-0500-4cf0-bd86-afb60afff196\") " Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.338653 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities" (OuterVolumeSpecName: "utilities") pod "02d3bec2-0500-4cf0-bd86-afb60afff196" (UID: "02d3bec2-0500-4cf0-bd86-afb60afff196"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.345161 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8" (OuterVolumeSpecName: "kube-api-access-ghqg8") pod "02d3bec2-0500-4cf0-bd86-afb60afff196" (UID: "02d3bec2-0500-4cf0-bd86-afb60afff196"). InnerVolumeSpecName "kube-api-access-ghqg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.414509 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02d3bec2-0500-4cf0-bd86-afb60afff196" (UID: "02d3bec2-0500-4cf0-bd86-afb60afff196"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.439423 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghqg8\" (UniqueName: \"kubernetes.io/projected/02d3bec2-0500-4cf0-bd86-afb60afff196-kube-api-access-ghqg8\") on node \"crc\" DevicePath \"\"" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.439455 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.439465 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d3bec2-0500-4cf0-bd86-afb60afff196-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.711053 4699 generic.go:334] "Generic (PLEG): container finished" podID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerID="ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521" exitCode=0 Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.711149 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerDied","Data":"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521"} Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.711196 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt4jp" event={"ID":"02d3bec2-0500-4cf0-bd86-afb60afff196","Type":"ContainerDied","Data":"02a24e01b1c182ae9a9efe8461a684df616383c8ce03e663ec76416efe2f36ea"} Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.711209 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jt4jp" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.711231 4699 scope.go:117] "RemoveContainer" containerID="ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.733227 4699 scope.go:117] "RemoveContainer" containerID="596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7" Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.757512 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"] Feb 26 12:27:55 crc kubenswrapper[4699]: I0226 12:27:55.766874 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jt4jp"] Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.289248 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" path="/var/lib/kubelet/pods/02d3bec2-0500-4cf0-bd86-afb60afff196/volumes" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.339323 4699 scope.go:117] "RemoveContainer" containerID="b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.426488 4699 scope.go:117] "RemoveContainer" containerID="ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521" Feb 26 12:27:56 crc kubenswrapper[4699]: E0226 12:27:56.429707 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521\": container with ID starting with ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521 not found: ID does not exist" containerID="ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.429757 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521"} err="failed to get container status \"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521\": rpc error: code = NotFound desc = could not find container \"ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521\": container with ID starting with ba7fc42bb71cb3014343c4f82cbede20f0bcd5139709fa5f64efb305f44bb521 not found: ID does not exist" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.429791 4699 scope.go:117] "RemoveContainer" containerID="596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7" Feb 26 12:27:56 crc kubenswrapper[4699]: E0226 12:27:56.430275 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7\": container with ID starting with 596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7 not found: ID does not exist" containerID="596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.430313 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7"} err="failed to get container status \"596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7\": rpc error: code = NotFound desc = could not find container \"596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7\": container with ID starting with 596ca82299eb4b3c2c0c7cdfb8e6e760467fefb1fafd10f080a0a95e6093e4d7 not found: ID does not exist" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.430334 4699 scope.go:117] "RemoveContainer" containerID="b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7" Feb 26 12:27:56 crc kubenswrapper[4699]: E0226 12:27:56.430689 4699 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7\": container with ID starting with b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7 not found: ID does not exist" containerID="b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7" Feb 26 12:27:56 crc kubenswrapper[4699]: I0226 12:27:56.430721 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7"} err="failed to get container status \"b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7\": rpc error: code = NotFound desc = could not find container \"b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7\": container with ID starting with b4022229d156fc5677e43d0ae8674a6a03ab663afab5557ed38024993deb55f7 not found: ID does not exist" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.158054 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535148-xbtxt"] Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.159255 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.159276 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.159293 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="extract-utilities" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.159300 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="extract-utilities" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.159315 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.159324 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.159373 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="extract-content" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.159384 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="extract-content" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.161406 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="extract-utilities" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.161425 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="extract-utilities" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.161445 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="extract-content" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.161452 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="extract-content" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.161796 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d3bec2-0500-4cf0-bd86-afb60afff196" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.161820 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaa487f-21d2-470a-9bce-914a42da0710" containerName="registry-server" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.163386 4699 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.168076 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.172088 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.172681 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9855d" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.185610 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535148-xbtxt"] Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.239894 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992z2\" (UniqueName: \"kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2\") pod \"auto-csr-approver-29535148-xbtxt\" (UID: \"c75250fb-35c3-4966-a995-33aaa68ec5e9\") " pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.261010 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:28:00 crc kubenswrapper[4699]: E0226 12:28:00.261332 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.341095 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-992z2\" (UniqueName: \"kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2\") pod \"auto-csr-approver-29535148-xbtxt\" (UID: \"c75250fb-35c3-4966-a995-33aaa68ec5e9\") " pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.360564 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-992z2\" (UniqueName: \"kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2\") pod \"auto-csr-approver-29535148-xbtxt\" (UID: \"c75250fb-35c3-4966-a995-33aaa68ec5e9\") " pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.486091 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:00 crc kubenswrapper[4699]: I0226 12:28:00.933441 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535148-xbtxt"] Feb 26 12:28:01 crc kubenswrapper[4699]: I0226 12:28:01.773046 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" event={"ID":"c75250fb-35c3-4966-a995-33aaa68ec5e9","Type":"ContainerStarted","Data":"f300627272e8f7839b6085e86065bce05f2918a517a3d28a295e941c8f546eed"} Feb 26 12:28:02 crc kubenswrapper[4699]: I0226 12:28:02.782731 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" event={"ID":"c75250fb-35c3-4966-a995-33aaa68ec5e9","Type":"ContainerStarted","Data":"802b2f4b84123adda32c52f3d96782204f705a61314b180c33cfe926c167eee1"} Feb 26 12:28:03 crc kubenswrapper[4699]: I0226 12:28:03.791104 4699 generic.go:334] "Generic (PLEG): container finished" podID="c75250fb-35c3-4966-a995-33aaa68ec5e9" 
containerID="802b2f4b84123adda32c52f3d96782204f705a61314b180c33cfe926c167eee1" exitCode=0 Feb 26 12:28:03 crc kubenswrapper[4699]: I0226 12:28:03.791370 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" event={"ID":"c75250fb-35c3-4966-a995-33aaa68ec5e9","Type":"ContainerDied","Data":"802b2f4b84123adda32c52f3d96782204f705a61314b180c33cfe926c167eee1"} Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.216662 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.352907 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-992z2\" (UniqueName: \"kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2\") pod \"c75250fb-35c3-4966-a995-33aaa68ec5e9\" (UID: \"c75250fb-35c3-4966-a995-33aaa68ec5e9\") " Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.366414 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2" (OuterVolumeSpecName: "kube-api-access-992z2") pod "c75250fb-35c3-4966-a995-33aaa68ec5e9" (UID: "c75250fb-35c3-4966-a995-33aaa68ec5e9"). InnerVolumeSpecName "kube-api-access-992z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.455717 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-992z2\" (UniqueName: \"kubernetes.io/projected/c75250fb-35c3-4966-a995-33aaa68ec5e9-kube-api-access-992z2\") on node \"crc\" DevicePath \"\"" Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.812951 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" event={"ID":"c75250fb-35c3-4966-a995-33aaa68ec5e9","Type":"ContainerDied","Data":"f300627272e8f7839b6085e86065bce05f2918a517a3d28a295e941c8f546eed"} Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.812996 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f300627272e8f7839b6085e86065bce05f2918a517a3d28a295e941c8f546eed" Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.813062 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535148-xbtxt" Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.867546 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535142-vfnhz"] Feb 26 12:28:05 crc kubenswrapper[4699]: I0226 12:28:05.877990 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535142-vfnhz"] Feb 26 12:28:06 crc kubenswrapper[4699]: I0226 12:28:06.275437 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a1ba6a1-6a82-47c4-9706-f77275f34d3a" path="/var/lib/kubelet/pods/8a1ba6a1-6a82-47c4-9706-f77275f34d3a/volumes" Feb 26 12:28:15 crc kubenswrapper[4699]: I0226 12:28:15.261185 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:28:15 crc kubenswrapper[4699]: E0226 12:28:15.261978 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:28:28 crc kubenswrapper[4699]: I0226 12:28:28.260808 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:28:28 crc kubenswrapper[4699]: E0226 12:28:28.261802 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-28p79_openshift-machine-config-operator(e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff)\"" pod="openshift-machine-config-operator/machine-config-daemon-28p79" podUID="e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff" Feb 26 12:28:41 crc kubenswrapper[4699]: I0226 12:28:41.852677 4699 scope.go:117] "RemoveContainer" containerID="99ba6bfc8510f503ebb43686b1e59641b632a364615001984ba3d20ee91c082d" Feb 26 12:28:42 crc kubenswrapper[4699]: I0226 12:28:42.261191 4699 scope.go:117] "RemoveContainer" containerID="db77674b6c4932546ef12383ba478551ccaf182e6ce1c41c613b6bd49208df30" Feb 26 12:28:43 crc kubenswrapper[4699]: I0226 12:28:43.184235 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-28p79" event={"ID":"e2a8e4ba-2bc7-4e9e-b7b9-dfce00cecdff","Type":"ContainerStarted","Data":"c4f9f594c37e44e301e74c3bc3bace132ed50a6fe4e52a87ba8cc2a910ac571e"} Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.316945 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w7wbx"] Feb 26 12:29:32 crc kubenswrapper[4699]: E0226 12:29:32.317888 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c75250fb-35c3-4966-a995-33aaa68ec5e9" containerName="oc" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.317902 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75250fb-35c3-4966-a995-33aaa68ec5e9" containerName="oc" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.318131 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75250fb-35c3-4966-a995-33aaa68ec5e9" containerName="oc" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.319522 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.348590 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7wbx"] Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.451165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-catalog-content\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.451335 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwb2\" (UniqueName: \"kubernetes.io/projected/29138193-2151-4750-bfb5-cca407a7f6f8-kube-api-access-lbwb2\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.451545 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-utilities\") pod \"redhat-operators-w7wbx\" (UID: 
\"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.553711 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwb2\" (UniqueName: \"kubernetes.io/projected/29138193-2151-4750-bfb5-cca407a7f6f8-kube-api-access-lbwb2\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.553804 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-utilities\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.553923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-catalog-content\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.554436 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-utilities\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.554463 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29138193-2151-4750-bfb5-cca407a7f6f8-catalog-content\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " 
pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.580690 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwb2\" (UniqueName: \"kubernetes.io/projected/29138193-2151-4750-bfb5-cca407a7f6f8-kube-api-access-lbwb2\") pod \"redhat-operators-w7wbx\" (UID: \"29138193-2151-4750-bfb5-cca407a7f6f8\") " pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:32 crc kubenswrapper[4699]: I0226 12:29:32.648312 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7wbx" Feb 26 12:29:33 crc kubenswrapper[4699]: I0226 12:29:33.136411 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7wbx"] Feb 26 12:29:33 crc kubenswrapper[4699]: I0226 12:29:33.644593 4699 generic.go:334] "Generic (PLEG): container finished" podID="29138193-2151-4750-bfb5-cca407a7f6f8" containerID="6c8f8fcf2debae75b40182870a6800883b4df2cbc03e64023fd4c971fb361ef9" exitCode=0 Feb 26 12:29:33 crc kubenswrapper[4699]: I0226 12:29:33.644896 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7wbx" event={"ID":"29138193-2151-4750-bfb5-cca407a7f6f8","Type":"ContainerDied","Data":"6c8f8fcf2debae75b40182870a6800883b4df2cbc03e64023fd4c971fb361ef9"} Feb 26 12:29:33 crc kubenswrapper[4699]: I0226 12:29:33.644928 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7wbx" event={"ID":"29138193-2151-4750-bfb5-cca407a7f6f8","Type":"ContainerStarted","Data":"94a6780caecf32f1ad7d55e8859b300aac9007115891a2146c786976a754dc8c"} Feb 26 12:29:35 crc kubenswrapper[4699]: I0226 12:29:35.663916 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7wbx" 
event={"ID":"29138193-2151-4750-bfb5-cca407a7f6f8","Type":"ContainerStarted","Data":"b2cee1367f890d7bb6a2d67fab857d2642af3a2057061f12d14a8d883f6fc03b"} Feb 26 12:29:42 crc kubenswrapper[4699]: I0226 12:29:42.722985 4699 generic.go:334] "Generic (PLEG): container finished" podID="29138193-2151-4750-bfb5-cca407a7f6f8" containerID="b2cee1367f890d7bb6a2d67fab857d2642af3a2057061f12d14a8d883f6fc03b" exitCode=0 Feb 26 12:29:42 crc kubenswrapper[4699]: I0226 12:29:42.723067 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7wbx" event={"ID":"29138193-2151-4750-bfb5-cca407a7f6f8","Type":"ContainerDied","Data":"b2cee1367f890d7bb6a2d67fab857d2642af3a2057061f12d14a8d883f6fc03b"} Feb 26 12:29:43 crc kubenswrapper[4699]: I0226 12:29:43.734313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7wbx" event={"ID":"29138193-2151-4750-bfb5-cca407a7f6f8","Type":"ContainerStarted","Data":"5f67881196d229fdf31f75bd924a788b820088e5127f0c3de3dc7f3180db124b"} Feb 26 12:29:43 crc kubenswrapper[4699]: I0226 12:29:43.761966 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w7wbx" podStartSLOduration=2.230023587 podStartE2EDuration="11.761938748s" podCreationTimestamp="2026-02-26 12:29:32 +0000 UTC" firstStartedPulling="2026-02-26 12:29:33.647886074 +0000 UTC m=+4719.458712508" lastFinishedPulling="2026-02-26 12:29:43.179801235 +0000 UTC m=+4728.990627669" observedRunningTime="2026-02-26 12:29:43.75110058 +0000 UTC m=+4729.561927034" watchObservedRunningTime="2026-02-26 12:29:43.761938748 +0000 UTC m=+4729.572765182"